1 Download Hive
https://mirrors.cnnic.cn/apache/hive/
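The mirror hosts one directory per release. A minimal download sketch (the `hive-<version>/apache-hive-<version>-bin.tar.gz` layout follows the standard Apache mirror convention; the install path is this guide's example):

```shell
# Build the download URL for Hive 3.1.2 from the mirror above
# (standard Apache mirror layout: hive-<version>/apache-hive-<version>-bin.tar.gz)
HIVE_VERSION=3.1.2
MIRROR=https://mirrors.cnnic.cn/apache/hive
URL="${MIRROR}/hive-${HIVE_VERSION}/apache-hive-${HIVE_VERSION}-bin.tar.gz"
echo "${URL}"
# then: wget "${URL}" && tar -zxvf "apache-hive-${HIVE_VERSION}-bin.tar.gz" -C /home/xindaqi/software/install/
```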
2 Configuration
2.1 Create the MySQL database
create database db_hive312;
2.2 Create the configuration file hive-site.xml
Path:
your_path/conf/
e.g.:
/home/xindaqi/software/install/apache-hive-3.1.2-bin/conf
vim hive-site.xml
Add the following configuration:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
<property>
<name>datanucleus.schema.autoCreateAll</name>
<value>true</value>
</property>
<property>
<name>hive.metastore.local</name>
<value>true</value>
</property>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost:3306/db_hive312?characterEncoding=UTF-8&amp;useSSL=false&amp;serverTimezone=Asia/Shanghai</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.cj.jdbc.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>root</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>123456</value>
</property>
</configuration>
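One gotcha in the file above: inside XML, the `&` separators of the JDBC URL must be written as `&amp;`, otherwise Hive fails to parse hive-site.xml with a SAXParseException. A quick check sketch (the temp file path is illustrative):

```shell
# The '&' separators in the JDBC URL must be XML-escaped as '&amp;',
# otherwise parsing hive-site.xml fails with a SAXParseException.
cat > /tmp/jdbc-url-check.xml <<'EOF'
<value>jdbc:mysql://localhost:3306/db_hive312?characterEncoding=UTF-8&amp;useSSL=false&amp;serverTimezone=Asia/Shanghai</value>
EOF
ESCAPED=$(grep -o '&amp;' /tmp/jdbc-url-check.xml | wc -l)
echo "${ESCAPED}"   # both separators escaped
```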
2.3 Configure logging
Create the log directory:
cd /home/xindaqi/software/install/apache-hive-3.1.2-bin
mkdir logs
Result:
/home/xindaqi/software/install/apache-hive-3.1.2-bin/logs
Configuration file hive-log4j2.properties
In the conf directory, find hive-log4j2.properties.template and create a new file hive-log4j2.properties:
vim hive-log4j2.properties
Copy the contents of hive-log4j2.properties.template into hive-log4j2.properties, then change the log directory to the newly created path:
property.hive.log.dir = /home/xindaqi/software/install/apache-hive-3.1.2-bin/logs
2.4 Configure Hive environment variables
In the conf directory, create the file hive-env.sh:
vim hive-env.sh
# Hadoop path
export HADOOP_HOME=/home/xindaqi/software/install/hadoop-3.3.0
# Hive configuration directory
export HIVE_CONF_DIR=/home/xindaqi/software/install/apache-hive-3.1.2-bin/conf
# Java path
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
2.5 Configure execution logging
In the conf directory, create the file hive-exec-log4j2.properties:
vim hive-exec-log4j2.properties
Copy the contents of hive-exec-log4j2.properties.template (also in conf) into hive-exec-log4j2.properties and change the log directory:
property.hive.log.dir = /home/xindaqi/software/install/apache-hive-3.1.2-bin/logs
2.6 Add the MySQL driver
Go to /home/xindaqi/software/install/apache-hive-3.1.2-bin/lib and add the MySQL JDBC driver jar (it must provide com.mysql.cj.jdbc.Driver, i.e. Connector/J 8.x, matching the driver class configured above).
Driver download link:
https://download.csdn.net/download/Xin_101/19361262?spm=1001.2014.3001.5501
2.7 Configure HiveServer2
Add the following properties to hive-site.xml, where xindaqi is the HiveServer2 username. It must match the Hadoop username, otherwise databases and tables cannot be created.
<property>
<name>hive.server2.thrift.port</name>
<value>10000</value>
</property>
<property>
<name>hive.server2.thrift.bind.host</name>
<value>192.168.211.129</value>
</property>
<property>
<name>hive.server2.thrift.client.user</name>
<value>xindaqi</value>
<description>Username to use against thrift client</description>
</property>
<property>
<name>hive.server2.thrift.client.password</name>
<value>123456</value>
<description>Password to use against thrift client</description>
</property>
2.8 Configure Hadoop core-site.xml
In etc/hadoop, open core-site.xml and add the following properties, where xindaqi is the Hadoop username.
After editing, restart HDFS, or refresh the proxy-user settings with hdfs dfsadmin -refreshSuperUserGroupsConfiguration.
<property>
<name>hadoop.proxyuser.xindaqi.hosts</name>
<value>*</value>
</property>
<property>
<name>hadoop.proxyuser.xindaqi.groups</name>
<value>*</value>
</property>
3 Initialize Hive
Go to the bin directory and run:
./schematool -initSchema -dbType mysql
On success, the tool prints "schemaTool completed".
4 Startup
4.1 Start HiveServer2
Go to the bin directory and start the service:
./hiveserver2
4.2 Start Hive
4.2.1 Via the hive CLI
Go to the bin directory and run the command below to enter the Hive shell:
./hive
4.2.2 Via beeline
- Start beeline
./beeline
- Connect to the Hive service
beeline> !connect jdbc:hive2://192.168.211.129:10000
- Enter the username and password
Username and password:
xindaqi
123456
- List databases
show databases;
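As a non-interactive alternative, the same connection can be made in a single beeline invocation (-u/-n/-p/-e are standard beeline options; host, user, and password are this guide's example values). The sketch below only builds and prints the command:

```shell
# Build the one-shot beeline command equivalent to the interactive !connect flow.
# -u = JDBC URL, -n = username, -p = password, -e = statement to run
HOST=192.168.211.129
CMD="./beeline -u jdbc:hive2://${HOST}:10000 -n xindaqi -p 123456 -e 'show databases;'"
echo "${CMD}"
```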
4.3 Open the Hive web page
By default HiveServer2 also serves a web UI on port 10002 (hive.server2.webui.port), e.g. http://192.168.211.129:10002.
5 Troubleshooting
Running schematool to initialize the schema failed with the following error:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/xindaqi/software/install/apache-hive-3.1.2-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/xindaqi/software/install/hadoop-3.3.0/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1380)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1361)
at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5104)
at org.apache.hive.beeline.HiveSchemaTool.<init>(HiveSchemaTool.java:96)
at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1473)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
- Cause
The Guava versions bundled with Hadoop and Hive are incompatible.
- Solution
Copy Hadoop's guava-27.0-jre.jar into Hive's lib directory, and remove (or back up) the older guava jar that ships with Hive (guava-19.0.jar).
Hadoop guava path:
/home/xindaqi/software/install/hadoop-3.3.0/share/hadoop/common/lib
Hive guava path:
/home/xindaqi/software/install/apache-hive-3.1.2-bin/lib
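The jar swap can be scripted. The sketch below simulates it in temporary directories so it is safe to run anywhere; for real use, point HADOOP_LIB and HIVE_LIB at the two lib paths listed above (guava-19.0.jar is the version Hive 3.1.2 ships with):

```shell
# Simulated jar swap; substitute the real Hadoop/Hive lib paths for actual use.
HADOOP_LIB=$(mktemp -d)   # stands in for .../hadoop-3.3.0/share/hadoop/common/lib
HIVE_LIB=$(mktemp -d)     # stands in for .../apache-hive-3.1.2-bin/lib
touch "${HADOOP_LIB}/guava-27.0-jre.jar" "${HIVE_LIB}/guava-19.0.jar"
mv "${HIVE_LIB}/guava-19.0.jar" "${HIVE_LIB}/guava-19.0.jar.bak"   # back up Hive's old guava
cp "${HADOOP_LIB}/guava-27.0-jre.jar" "${HIVE_LIB}/"               # copy Hadoop's newer guava
ls "${HIVE_LIB}"
```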