1. Pre-installation Preparation
1.1 Installation Environment
[root@server1 ~]# cat /etc/redhat-release
CentOS Linux release 7.6.1810 (Core)
1.2 Dependent Services (install these first)
jdk1.8:https://blog.csdn.net/qq_39680564/article/details/82768938
mysql5.7:https://blog.csdn.net/qq_39680564/article/details/84943471
hadoop-3.0.3:https://blog.csdn.net/qq_39680564/article/details/89513162
hiveHQL:https://www.cnblogs.com/racin-job/p/8695174.html
2. Download the Installation Package
cd /opt
wget http://archive.apache.org/dist/hive/hive-3.1.1/apache-hive-3.1.1-bin.tar.gz
tar -zxvf apache-hive-3.1.1-bin.tar.gz
mv apache-hive-3.1.1-bin hive
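As an optional check that the archive unpacked correctly, list the directory (the layout below is the standard Hive distribution layout):
# bin, conf and lib should all be present under /opt/hive
ls /opt/hive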
3. Configure Environment Variables
vim ~/.bashrc
Add the following:
# hive
export HIVE_HOME=/opt/hive
export PATH=$PATH:$HIVE_HOME/bin
Reload the profile:
source ~/.bashrc
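As a quick check that the variables took effect (optional):
# both commands should point at /opt/hive
echo $HIVE_HOME
which hive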
4. Modify the Configuration Files
4.1 Create the hive-site.xml File
cp /opt/hive/conf/hive-default.xml.template /opt/hive/conf/hive-site.xml
vim /opt/hive/conf/hive-site.xml
4.2 Add the database connection configuration (the properties with the same names that already exist in the copied template must be deleted, otherwise they override these values)
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://192.168.1.100:3306/hive?useSSL=false</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>123456</value>
</property>
<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
</property>
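Because hive-site.xml was copied from the full template, each of these property names already exists somewhere in the file. One way to locate the old blocks that need to be removed (a minimal sketch; the line numbers will differ between Hive versions):
# list every line where one of the connection-related properties is defined
grep -nE 'javax.jdo.option.Connection(URL|DriverName|UserName|Password)|hive.metastore.schema.verification' /opt/hive/conf/hive-site.xml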

4.3 Set the Temporary Directories
vim /opt/hive/conf/hive-site.xml
Replace ${system:java.io.tmpdir} in the following three properties with a local directory, which you must create yourself:
<name>hive.exec.local.scratchdir</name>
<name>hive.downloaded.resources.dir</name>
<name>hive.server2.logging.operation.log.location</name>
mkdir -p /data/hive/tmp/
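Rather than editing each occurrence by hand, the substitution can also be scripted (a sketch, assuming /data/hive/tmp as the target directory; back up the file first; replacing ${system:user.name} as well is a common extra step, not part of the original instructions):
# back up, then replace the placeholders with a real local path
cp /opt/hive/conf/hive-site.xml /opt/hive/conf/hive-site.xml.bak
sed -i 's|${system:java.io.tmpdir}|/data/hive/tmp|g' /opt/hive/conf/hive-site.xml
sed -i 's|${system:user.name}|root|g' /opt/hive/conf/hive-site.xml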
5. Start Hive
5.1 Log in to MySQL and create the hive database
create database hive;
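For reference, a slightly fuller version of this step (a sketch; the latin1 character set and the dedicated account are optional extras, not required by the root-based connection configured above):
-- run inside the mysql client (mysql -uroot -p); latin1 avoids index-length issues
-- that some MySQL installations hit with the metastore schema
CREATE DATABASE hive DEFAULT CHARACTER SET latin1;
-- optional: use a dedicated metastore account instead of root (MySQL 5.7 syntax)
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%' IDENTIFIED BY '123456';
FLUSH PRIVILEGES;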
5.2 Put the MySQL JDBC driver into Hive's lib directory
cd /opt/hive/lib/
wget http://central.maven.org/maven2/mysql/mysql-connector-java/5.1.47/mysql-connector-java-5.1.47.jar
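If that mirror is unreachable (central.maven.org has since been retired), the same jar should also be available from Maven Central directly (alternative URL, not from the original article):
wget https://repo1.maven.org/maven2/mysql/mysql-connector-java/5.1.47/mysql-connector-java-5.1.47.jar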
5.3 Initialize the metastore schema from Hive's bin directory
cd /opt/hive/bin
schematool -dbType mysql -initSchema
This step failed with the following error:
Exception in thread "main" java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Illegal character entity: expansion character (code 0x8
at [row,col,system-id]: [3231,96,"file:/opt/hive/conf/hive-site.xml"]
...
(The full stack trace is reproduced in Section 6, Error 1.)
Fix: line 3231 of /opt/hive/conf/hive-site.xml contains an illegal character entity; delete it (see Section 6, Error 1 for how to locate it).

Re-run the initialization; this time it reports success:
Initialization script completed
schemaTool completed
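As an optional check (not part of the original steps), you can confirm that schematool actually created the metastore tables in MySQL:
# the hive database should now contain tables such as DBS, TBLS and VERSION
mysql -uroot -p -e "USE hive; SHOW TABLES;"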
5.4 Start and Verify
Check the version:
[root@master bin]# hive --version
Output:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hive-3.1.1/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-3.0.3/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive 3.1.1
Git git://daijymacpro-2.local/Users/daijy/commit/hive -r f4e0529634b6231a0072295da48af466cf2f10b7
Compiled by daijy on Tue Oct 23 17:19:24 PDT 2018
From source with checksum 6deca5a8401bbb6c6b49898be6fcb80e
Start Hive and create a database:
[root@master bin]# hive
hive> create database db_liuli;
OK
Time taken: 0.043 seconds
List the databases:
hive> show databases;
OK
db_liuli
default
Time taken: 0.019 seconds, Fetched: 2 row(s)
The new database also appears as a directory under Hive's warehouse path in HDFS.
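One way to see it (a sketch, assuming the default hive.metastore.warehouse.dir):
# db_liuli.db should be listed under the default warehouse directory
hdfs dfs -ls /user/hive/warehouse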

6. Troubleshooting
Errors when initializing with schematool -dbType mysql -initSchema
Error 1:
Exception in thread "main" java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Illegal character entity: expansion character (code 0x8
at [row,col,system-id]: [3231,96,"file:/opt/hive/conf/hive-site.xml"]
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3125)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2894)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2769)
at org.apache.hadoop.conf.Configuration.get(Configuration.java:1423)
at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:4990)
at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:5063)
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5150)
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5098)
at org.apache.hive.beeline.HiveSchemaTool.<init>(HiveSchemaTool.java:96)
at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1473)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:244)
at org.apache.hadoop.util.RunJar.main(RunJar.java:158)
Caused by: com.ctc.wstx.exc.WstxParsingException: Illegal character entity: expansion character (code 0x8
at [row,col,system-id]: [3231,96,"file:/opt/hive/conf/hive-site.xml"]
at com.ctc.wstx.sr.StreamScanner.constructWfcException(StreamScanner.java:621)
at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:491)
at com.ctc.wstx.sr.StreamScanner.reportIllegalChar(StreamScanner.java:2456)
at com.ctc.wstx.sr.StreamScanner.validateChar(StreamScanner.java:2403)
at com.ctc.wstx.sr.StreamScanner.resolveCharEnt(StreamScanner.java:2369)
at com.ctc.wstx.sr.StreamScanner.fullyResolveEntity(StreamScanner.java:1515)
at com.ctc.wstx.sr.BasicStreamReader.nextFromTree(BasicStreamReader.java:2828)
at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1123)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2960)
... 15 more

Fix: line 3231 of /opt/hive/conf/hive-site.xml contains an illegal character; open the file and jump to that line:
vim /opt/hive/conf/hive-site.xml
:set nu
:3231
Delete the offending character entity on that line (the parser reports code 0x8, i.e. a stray &#8; entity).
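If the line number differs in your copy of the template, the entity can also be located directly (a minimal sketch):
# find the illegal character entity the XML parser complains about
grep -n '&#8;' /opt/hive/conf/hive-site.xml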

Error 2:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-3.0.3/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL: jdbc:derby:;databaseName=metastore_db;create=true
Metastore Connection Driver : org.apache.derby.jdbc.EmbeddedDriver
Metastore connection User: APP
Starting metastore schema initialization to 3.1.0
Initialization script hive-schema-3.1.0.mysql.sql
Error: Syntax error: Encountered "<EOF>" at line 1, column 64. (state=42X01,code=30000)
org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
Underlying cause: java.io.IOException : Schema script failed, errorcode 2
Use --verbose for detailed stacktrace.
*** schemaTool failed ***

Fix: the output above shows the metastore falling back to the embedded Derby database (jdbc:derby:...), which means the template's original connection properties are still overriding the MySQL settings added in Section 4.2. Delete the original properties that share the same names from hive-site.xml.
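After removing the duplicates, re-running the initialization should report the MySQL URL instead of Derby (abridged expectation, based on the connection settings configured above):
schematool -dbType mysql -initSchema
# expected line in the output:
# Metastore connection URL: jdbc:mysql://192.168.1.100:3306/hive?useSSL=false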
Errors when starting hive
Error 1:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-3.0.3/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = 6d8b8d61-66cb-40d2-a876-ac512c2cf0c2
Logging initialized using configuration in jar:file:/opt/hive/lib/hive-common-3.1.1.jar!/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
at org.apache.hadoop.fs.Path.initialize(Path.java:259)
at org.apache.hadoop.fs.Path.<init>(Path.java:217)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:707)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:624)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:588)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:747)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:244)
at org.apache.hadoop.util.RunJar.main(RunJar.java:158)
Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
at java.net.URI.checkPath(URI.java:1823)
at java.net.URI.<init>(URI.java:745)
at org.apache.hadoop.fs.Path.initialize(Path.java:256)
... 12 more

Fix: the temporary-directory placeholders in the configuration were never replaced.
Replace ${system:java.io.tmpdir} in hive-site.xml with an existing local directory, i.e. the properties listed in Section 4.3:
vim /opt/hive/conf/hive-site.xml
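To confirm that no placeholder is left behind (a quick check, not part of the original article):
# should print nothing once every occurrence has been replaced
grep -n 'system:java.io.tmpdir' /opt/hive/conf/hive-site.xml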

This article walked through installing Hive 3.1.1 on CentOS 7.6: preparing the environment, downloading the package, configuring environment variables, editing hive-site.xml, initializing and verifying the metastore, and analyzing the common errors, with particular attention to the dependency links, the database connection settings in hive-site.xml, and the fixes for each error.