Hive configuration
Download Hive
Version used here: 2.3.0
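If the tarball is not already on the machine, the Apache archive keeps the 2.3.0 release (one possible source; any mirror carrying this release works):
cd /opt
wget https://archive.apache.org/dist/hive/hive-2.3.0/apache-hive-2.3.0-bin.tar.gz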
tar -zxvf apache-hive-2.3.0-bin.tar.gz
cd /opt/apache-hive-2.3.0-bin
Copy the MySQL driver jar into the lib directory
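For example (the connector jar name below is a placeholder; use the mysql-connector-java jar you actually have on hand):
cp mysql-connector-java-*.jar /opt/apache-hive-2.3.0-bin/lib/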
mkdir logs
mkdir tmp
Hive configuration files
cd /opt/apache-hive-2.3.0-bin/conf
cp hive-env.sh.template hive-env.sh
cp hive-default.xml.template hive-site.xml
cp hive-log4j2.properties.template hive-log4j2.properties
cp hive-exec-log4j2.properties.template hive-exec-log4j2.properties
Edit hive-env.sh
export JAVA_HOME=/usr/local/jdk1.7.0_80              # Java installation path
export HADOOP_HOME=/usr/local/hadoop                 # Hadoop installation path
export HIVE_HOME=/opt/apache-hive-2.3.0-bin          # Hive installation path
export HIVE_CONF_DIR=/opt/apache-hive-2.3.0-bin/conf # Hive configuration directory
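Before continuing, it may be worth confirming that those paths actually exist (adjust them to your environment):
ls /usr/local/jdk1.7.0_80 /usr/local/hadoop
/usr/local/hadoop/bin/hadoop version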
Edit hive-site.xml and set the metastore connection properties (MySQL on 192.168.1.161 in this example; note that & must be written as &amp; inside XML):
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.1.161:3306/hive?useUnicode=true&amp;characterEncoding=UTF-8</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123456</value>
  </property>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://192.168.1.161:9083</value>
  </property>
</configuration>
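The connection URL above assumes a database named hive already exists on 192.168.1.161. If it does not, one way to create it up front is shown below (a sketch using the same root account; -p prompts for the password):
mysql -h 192.168.1.161 -u root -p -e "CREATE DATABASE IF NOT EXISTS hive DEFAULT CHARACTER SET utf8;"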
Replace the template placeholders in hive-site.xml:
Replace every ${system:java.io.tmpdir} with /opt/apache-hive-2.3.0-bin/tmp
Replace every ${system:user.name} with root
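Both replacements can be done in one pass with sed (a sketch; -i.bak keeps a backup of the original file):
sed -i.bak \
  -e 's|\${system:java\.io\.tmpdir}|/opt/apache-hive-2.3.0-bin/tmp|g' \
  -e 's|\${system:user\.name}|root|g' \
  /opt/apache-hive-2.3.0-bin/conf/hive-site.xml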
Start Hive
# Initialize the metastore schema in MySQL (run once)
bin/schematool -dbType mysql -initSchema
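To confirm the schema was created, schematool can also report the schema version it finds in MySQL:
bin/schematool -dbType mysql -info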
bin/hive --service metastore &
bin/hive --service hiveserver2 &    # or: nohup bin/hiveserver2 &
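To keep both services running after the shell exits, one option is nohup with log files in the logs directory created earlier (a sketch; run from /opt/apache-hive-2.3.0-bin):
nohup bin/hive --service metastore   > logs/metastore.log   2>&1 &
nohup bin/hive --service hiveserver2 > logs/hiveserver2.log 2>&1 &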
Connect
bin/beeline -u jdbc:hive2://localhost:10000/ -n anonymous -p anonymous
bin/hive
netstat -nptl | grep 10000    # confirm HiveServer2 is listening on its default port 10000
Start the Hive shell and run "show tables;". If it completes without errors, Hive with the standalone (MySQL-backed) metastore is installed correctly.
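The same check can be run non-interactively from the install directory, for example:
bin/hive -e "show tables;"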
Errors
If beeline fails with:
root is not allowed to impersonate root (state=08S01,code=0)
add the following proxy-user settings to Hadoop's core-site.xml and restart (or refresh) Hadoop:
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
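Depending on the cluster, a full restart may not be needed; refreshing the proxy-user configuration on the NameNode and ResourceManager is often enough (run as the Hadoop admin user):
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration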
Hive on Spark
User class threw exception: java.lang.NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS
This error usually means the Hive and Spark versions do not match: Spark is loading a different hive-exec jar than the Hive 2.3.0 client, or the Spark build/version is not one supported by Hive 2.3.0. Align the versions; Hive on Spark expects a Spark build that does not bundle the Hive jars.
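One way to check for the mismatch is to compare the hive-exec jar shipped with this Hive install against any Hive jars visible to Spark (SPARK_HOME below is an assumption; point it at the Spark installation Hive actually uses):
ls /opt/apache-hive-2.3.0-bin/lib/hive-exec-*.jar
ls $SPARK_HOME/jars/ | grep -i hive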