Hive 3.1.1 Installation

Installation Steps

This article covers installing Hive 3.1.1 on top of Hadoop 3.2.0.
Download link: http://mirrors.shu.edu.cn/apache/

Extract the Hive tarball

[hadoop@master ~]$ tar -zxvf apache-hive-3.1.1-bin.tar.gz

Rename the Hive directory (personal preference)

[hadoop@master ~]$ mv apache-hive-3.1.1-bin hive-3.1.1

Configure the Hive environment variables

[hadoop@master ~]$ vi .bash_profile
PATH=$PATH:$HOME/bin
export HADOOP_HOME=/home/hadoop/hadoop-3.2.0
export HIVE_HOME=/home/hadoop/hive-3.1.1
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HIVE_HOME/bin

Make the environment variables take effect

[hadoop@master ~]$ source .bash_profile
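A quick way to confirm the variables are in effect is to echo them and check that the hive launcher resolves from the PATH; the output shown below simply mirrors the exports above:

[hadoop@master ~]$ echo $HIVE_HOME
/home/hadoop/hive-3.1.1
[hadoop@master ~]$ which hive
/home/hadoop/hive-3.1.1/bin/hive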

Modify the Hive configuration files

Configure hive-config.sh

[hadoop@master ~]$ vi hive-3.1.1/bin/hive-config.sh
Append the following at the end:
export JAVA_HOME=/usr/java/jdk1.8.0_201-amd64
export HIVE_HOME=/home/hadoop/hive-3.1.1
export HADOOP_HOME=/home/hadoop/hadoop-3.2.0

Configure hive-site.xml

[hadoop@master ~]$ cd hive-3.1.1/conf/
[hadoop@master conf]$ cp hive-default.xml.template hive-site.xml

Configure MySQL as the Hive metastore database

For how to set up a MySQL database on Linux, see my other article:

https://blog.csdn.net/genus_yang/article/details/87939556

Create a hive user in MySQL

[root@master ~]# mysql -u root -p123
mysql> create user 'hive' identified by '123';
mysql> grant all privileges on *.* to 'hive'@'%' with grant option;
mysql> grant all privileges on *.* to hive@master identified by '123';
mysql> flush privileges;
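If you want to double-check that the account and its privileges were created as intended, the standard MySQL catalog queries (run from the same root session) are enough; the exact output depends on your MySQL version:

mysql> select user, host from mysql.user where user='hive';
mysql> show grants for 'hive'@'%';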

Create a dedicated metastore database for Hive

[root@master ~]# mysql -h 169.254.1.100 -u hive -p123
mysql> create database hive;
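A show databases in the same session should now list the newly created hive database alongside the defaults, confirming both that the hive login works over the network and that the database exists:

mysql> show databases;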

Edit the Hive configuration file hive-site.xml

[hadoop@master ~]$ vi hive-3.1.1/conf/hive-site.xml
Append the following before the closing </configuration> tag:
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://169.254.1.100:3306/hive?characterEncoding=UTF-8</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>123</value>
</property>
<property>
  <name>datanucleus.schema.autoCreateAll</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
</property>
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/home/hadoop/hive-3.1.1/tmp</value>
  <description>Local scratch space for Hive jobs</description>
</property>
<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/home/hadoop/hive-3.1.1/tmp/resources</value>
  <description>Temporary local directory for added resources in the remote file system.</description>
</property>
<property>
  <name>hive.querylog.location</name>
  <value>/home/hadoop/hive-3.1.1/tmp</value>
  <description>Location of Hive run time structured log file</description>
</property>
<property>
  <name>hive.server2.logging.operation.log.location</name>
  <value>/home/hadoop/hive-3.1.1/tmp/operation_logs</value>
  <description>Top level directory where operation logs are stored if logging functionality is enabled</description>
</property>

Create a tmp scratch directory under the Hive directory

[hadoop@master ~]$ mkdir /home/hadoop/hive-3.1.1/tmp
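The hive-site.xml settings above also point hive.downloaded.resources.dir and hive.server2.logging.operation.log.location at subdirectories of this tmp directory; Hive generally creates them on demand, but creating them up front does no harm:

[hadoop@master ~]$ mkdir -p /home/hadoop/hive-3.1.1/tmp/resources /home/hadoop/hive-3.1.1/tmp/operation_logs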

Copy the MySQL JDBC driver into Hive's lib directory

[hadoop@master ~]$ cp mysql-connector-java-5.1.47.jar hive-3.1.1/lib/
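A quick listing confirms the driver jar is now in Hive's classpath directory (the file name will differ if you downloaded another Connector/J release):

[hadoop@master ~]$ ls hive-3.1.1/lib/ | grep mysql
mysql-connector-java-5.1.47.jar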

Download link: http://central.maven.org/maven2/mysql/mysql-connector-java/
Connector/J and MySQL version compatibility reference: https://www.cnblogs.com/peijie-tech/articles/4446011.html

Connector/J version | Driver Type | JDBC version | MySQL Server        | Status
5.1                 | 4           | 3.0, 4.0     | 4.1, 5.0, 5.1, 5.5  | Recommended version
5.0                 | 4           | 3.0          | 4.1, 5.0            | Released version
3.1                 | 4           | 3.0          | 4.1, 5.0            | Obsolete
3.0                 | 4           | 3.0          | 3.x, 4.1            | Obsolete

Initialize the Hive metastore database

[hadoop@master ~]$ schematool -dbType mysql -initSchema

The initialization failed with the following error:

Caused by: com.ctc.wstx.exc.WstxParsingException: Illegal character entity: expansion character (code 0x8
at [row,col,system-id]: [3210,96,"file:/home/hadoop/hive-3.1.1/conf/hive-site.xml"]

This means there is an illegal character at line 3210 of hive-site.xml:

3209 <description>
3210 Ensures commands with OVERWRITE (such as INSERT OVERWRITE) acquire Exclusive locks for�transactional tables. This ensures that inserts (w/o overwrite) running concurrently
3211 are not hidden by the INSERT OVERWRITE.
3212 </description>

Delete the illegal character and re-run the initialization.
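The parser is complaining about the &#8; character entity; assuming that is what sits at line 3210 of your copy (it can also appear as a raw control character, in which case simply delete it in vi), a targeted sed is one way to strip it, adjusting the line number to whatever your error reports:

[hadoop@master ~]$ sed -i '3210s/&#8;//' hive-3.1.1/conf/hive-site.xml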

[hadoop@master ~]$ schematool -dbType mysql -initSchema
Closing: 0: jdbc:mysql://169.254.1.100:3306/hive?characterEncoding=UTF-8
beeline>
beeline> Initialization script completed
schemaTool completed

This indicates that the initialization completed successfully.

Verify the Hive installation

(screenshot: the hive CLI starts successfully)
Check the Hive metadata in MySQL:
(screenshot: metadata tables inside the hive database in MySQL)
The hive database now contains many metadata-related tables, which shows that Hive has been installed successfully!
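For reference, what the screenshots showed can be reproduced from the command line; prompts and listings will vary by environment, but roughly:

[hadoop@master ~]$ hive
hive> show databases;

and, on the MySQL side:

[root@master ~]# mysql -h 169.254.1.100 -u hive -p123
mysql> use hive;
mysql> show tables;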
