Hive 2.3.4 Configuration

Upgrading Hive from 1.2.0 to 2.3.4.

Download and extraction are skipped here; the focus is configuration. Also, remember to update the environment variables that still point at the old Hive install.

  • cp hive-log4j2.properties.template hive-log4j2.properties

  • cp hive-default.xml.template hive-site.xml

Edit hive-site.xml with vim (or a tool such as EverEdit).

Find and modify the following settings:

<!-- Hive metastore settings (remote mode) -->
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
</property>
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://hadoop03:9083</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://hadoop03:3306/hive2?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>root</value>
</property>

 

<!-- HiveServer2 Thrift settings -->
<property>
  <name>hive.server2.thrift.bind.host</name>
  <value>hadoop03</value>
  <description>Bind host on which to run the HiveServer2 Thrift service.</description>
</property>
<property>
  <name>hive.server2.thrift.port</name>
  <value>20000</value>
  <description>Port number of HiveServer2 Thrift interface when hive.server2.transport.mode is 'binary'.</description>
</property>
<property>
  <name>hive.server2.thrift.http.port</name>
  <value>20001</value>
  <description>Port number of HiveServer2 Thrift interface when hive.server2.transport.mode is 'http'.</description>
</property>

 

<!-- Client credentials -->
<property>
  <name>hive.server2.thrift.client.user</name>
  <value>root</value>
  <description>Username to use against thrift client</description>
</property>
<property>
  <name>hive.server2.thrift.client.password</name>
  <value>root</value>
  <description>Password to use against thrift client</description>
</property>

Save the file.

  • Edit hive-env.sh:

# HADOOP_HOME=${bin}/../../hadoop
HADOOP_HOME=/home/hadoop-2.7.1
# Hive Configuration Directory can be controlled by:
# export HIVE_CONF_DIR=
export HIVE_CONF_DIR=/home/software/apache-hive-2.3.4-bin/conf

# Folder containing extra libraries required for hive compilation/execution can be controlled by:
# export HIVE_AUX_JARS_PATH=
export HIVE_AUX_JARS_PATH=/home/software/apache-hive-2.3.4-bin/lib

Don't forget to add HIVE_HOME to your environment variables.
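As a concrete sketch (the install path below follows this walkthrough; adjust it to your machine), the additions to /etc/profile might look like:

```shell
# Assumed install path from this walkthrough -- adjust to your machine.
export HIVE_HOME=/home/software/apache-hive-2.3.4-bin
export PATH=$PATH:$HIVE_HOME/bin
echo "HIVE_HOME=$HIVE_HOME"
```

After editing /etc/profile, run `source /etc/profile` so the current shell picks up the change.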

 

Troubleshooting:

 

First delete the old Hive directory on HDFS, then run:

hadoop fs -mkdir /tmp

hadoop fs -mkdir -p /user/hive/warehouse

hadoop fs -chmod g+w /tmp

hadoop fs -chmod g+w /user/hive/warehouse

 

Start bin/hive:

 

[root@hadoop03 bin]# ./hive

which: no hbase in (/usr/java/jdk1.8.0_111/bin:/home/hadoop-2.7.1/bin:/home/hadoop-2.7.1/sbin:/home/software/zookeeper-3.4.7/bin:/home/software/sqoop1.4.7/bin:/usr/java/jdk1.8.0_111/bin:/home/hadoop-2.7.1/bin:/home/hadoop-2.7.1/sbin:/home/software/zookeeper-3.4.7/bin:/home/software/sqoop1.4.6/bin:/usr/java/jdk1.8.0_111/bin:/home/hadoop-2.7.1/bin:/home/hadoop-2.7.1/sbin:/home/software/zookeeper-3.4.7/bin:/home/software/sqoop-1.4.4.bin__hadoop-2.0.4-alpha/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin)

The message above means HBase cannot be found on the PATH; adding HBASE_HOME to the environment variables resolves it.
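For example (the HBase path here is a hypothetical placeholder; substitute your actual install directory):

```shell
# Hypothetical HBase install location -- replace with your real one.
export HBASE_HOME=/home/software/hbase
export PATH=$PATH:$HBASE_HOME/bin
```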

Start ./hive again:

Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D

To fix this error, find every setting in hive-site.xml that references ${system:java.io.tmpdir} and point it at a concrete path of your choosing:

<property>
  <name>hive.exec.local.scratchdir</name>
  <!-- <value>${system:java.io.tmpdir}/${system:user.name}</value> -->
  <value>/tmp/hive/local</value>
  <description>Local scratch space for Hive jobs</description>
</property>
<property>
  <name>hive.downloaded.resources.dir</name>
  <!-- <value>${system:java.io.tmpdir}/${hive.session.id}_resources</value> -->
  <value>/tmp/hive/resources</value>
  <description>Temporary local directory for added resources in the remote file system.</description>
</property>
<property>
  <name>hive.querylog.location</name>
  <!-- <value>${system:java.io.tmpdir}/${system:user.name}</value> -->
  <value>/home/software/apache-hive-2.3.4-bin/logs</value>
  <description>Location of Hive run time structured log file</description>
</property>
<property>
  <name>hive.server2.logging.operation.log.location</name>
  <!-- <value>${system:java.io.tmpdir}/${system:user.name}/operation_logs</value> -->
  <value>/home/software/apache-hive-2.3.4-bin/operation_logs</value>
  <description>Top level directory where operation logs are stored if logging functionality is enabled</description>
</property>
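These directories are not created automatically, so it can help to create them up front (the paths are the ones chosen above; adjust if yours differ):

```shell
# Create the local scratch/log directories referenced in hive-site.xml above.
mkdir -p /tmp/hive/local /tmp/hive/resources
ls -d /tmp/hive/local /tmp/hive/resources

# Log dirs under the assumed install path from this walkthrough.
HIVE_DIR=/home/software/apache-hive-2.3.4-bin
mkdir -p "$HIVE_DIR/logs" "$HIVE_DIR/operation_logs"
```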

 

(There is a similar directory under bin/ as well.)

 

Start ./hive again:

Logging initialized using configuration in file:/home/software/apache-hive-1.2.0-bin/conf/hive-log4j.properties

Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

SessionHiveMetaStoreClient cannot be instantiated. The reason is that in Hive 2.3.4, starting ./hive requires the metastore service to be running first (and its schema to be initialized); run it in the background with ./hive --service metastore &

Starting ./hive again now succeeds.

 

Running HiveServer2 and Beeline

See the official Hive Getting Started guide: https://cwiki.apache.org/confluence/display/Hive/GettingStarted

Before starting HiveServer2, run the schematool command as an initialization step. Starting with Hive 2.1, this step is required. Here we use "mysql" as the db type, so MySQL must be configured as Hive's metastore in hive-site.xml (as above).

 

$HIVE_HOME/bin/schematool -dbType <db type> -initSchema

For example: ./schematool -dbType mysql -initSchema

 

Start the metastore:

./hive --service metastore &

Start HiveServer2 and connect with Beeline:

./hive --service hiveserver2 &

./beeline -u jdbc:hive2://hadoop03:20000/ -n root -p root

 

This produced: Error: Could not open client transport with JDBC Uri: jdbc:hive2://hadoop03:20000: java.net.ConnectException: Connection refused (state=08S01,code=0)

Search results suggested this was a Hadoop permissions issue, fixed by modifying the relevant proxy-user settings in core-site.xml, among other things:

https://blog.csdn.net/lsttoy/article/details/53490144

https://blog.csdn.net/adorechen/article/details/79158807

https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/Superusers.html

None of these turned out to be the cause in my case.

Checking with lsof -i:20000 or netstat -nao | grep 20000 showed that nothing was listening on port 20000, even though jps showed the processes had started. Oddly, port 10000 was listening, and ./beeline -u jdbc:hive2://hadoop03:10000/ -n root -p root actually connected. So had my configuration not taken effect at all?
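The two port checks above can be wrapped in a small helper (a sketch; it prefers netstat and falls back to ss):

```shell
# Report whether anything is listening on a local TCP port.
check_port() {
  port="$1"
  if command -v netstat >/dev/null 2>&1; then
    listeners=$(netstat -tln 2>/dev/null)
  else
    listeners=$(ss -tln 2>/dev/null)
  fi
  if printf '%s\n' "$listeners" | grep -q ":$port "; then
    echo "port $port: listening"
  else
    echo "port $port: NOT listening"
  fi
}

check_port 10000
check_port 20000
```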

Checking with ps -ef | grep hive:

 

The process was still running hive-service-1.2.0.jar. I had forgotten to update the HIVE_HOME path, which still pointed at the old install, so metastore and HiveServer2 were still picking up the old Hive 1.2.0 configuration.

After correcting the HIVE_HOME path, I restarted the metastore and HiveServer2.

ps -ef | grep hive now shows the correct 2.3.4 processes.

./beeline -u jdbc:hive2://hadoop03:20000/ -n root -p root

The connection succeeds.

 

 

 

 
