Notes on installing Hive 3.1.2 on a Hadoop 3.2.1 cluster

Background
A Hadoop 3.2.1 cluster is already installed, but without Hive. Hadoop 3.2.1 is a fairly recent release, and there are plenty of write-ups online installing Hive 3.1.2 on Hadoop 3.2.x, so I assumed the two versions would be compatible. The installation is fairly simple overall, but I still ran into a few problems, so I am recording the process here.

  1. Install Hive
    Download: http://archive.apache.org/dist/hive/hive-3.1.2/
[root@slave1 ~]# tar -zxvf  apache-hive-3.1.2-bin.tar.gz 
[root@slave1 ~]# mv apache-hive-3.1.2-bin /usr/local/
[root@slave1 local]# mv apache-hive-3.1.2-bin/ /usr/local/hive-3.1.2

Set environment variables

vim /etc/profile

export PATH=$PATH:/usr/local/hive-3.1.2/bin
source /etc/profile
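Many setups also export HIVE_HOME alongside the PATH entry, since later config steps refer to the install directory. A minimal sketch, assuming the install path used above:

```shell
# Append to /etc/profile (install path from the step above),
# then reload with: source /etc/profile
export HIVE_HOME=/usr/local/hive-3.1.2
export PATH=$PATH:$HIVE_HOME/bin
```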

Create the Hive warehouse directory and set permissions

mkdir /usr/local/hive-3.1.2/warehouse
hadoop fs -mkdir -p /usr/local/hive-3.1.2/warehouse
hadoop fs -chmod 777 /usr/local/hive-3.1.2/warehouse
hadoop fs -ls /usr/local/hive-3.1.2/
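Note that Hive's default warehouse location on HDFS is /user/hive/warehouse; if you want Hive to use the directory created above instead, the path needs to be set in hive-site.xml. A sketch, using the HDFS path from the commands above:

```xml
<property>
	<name>hive.metastore.warehouse.dir</name>
	<value>/usr/local/hive-3.1.2/warehouse</value>
</property>
```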

Copy and back up the stock Hive config templates

cd /usr/local/hive-3.1.2/conf/
cp hive-exec-log4j2.properties.template hive-exec-log4j2.properties
cp hive-log4j2.properties.template hive-log4j2.properties
cp hive-default.xml.template hive-default.xml
cp hive-default.xml.template hive-site.xml
cp hive-env.sh.template hive-env.sh

Edit hive-env.sh

vim hive-env.sh

HADOOP_HOME=/your/hadoop/path
export HIVE_CONF_DIR=/usr/local/hive-3.1.2/conf
export HIVE_AUX_JARS_PATH=/usr/local/hive-3.1.2/lib

Edit hive-site.xml

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
	<property>
		<name>javax.jdo.option.ConnectionURL</name>
		 <value>jdbc:mysql://your-mysql-host:3306/hive?createDatabaseIfNotExist=true&amp;useUnicode=true&amp;characterEncoding=UTF-8&amp;useSSL=false</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionDriverName</name>
		<value>com.mysql.jdbc.Driver</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionUserName</name>
		<value>root</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionPassword</name>
		<value>your-mysql-password</value>
	</property>
	<property>
		<name>datanucleus.readOnlyDatastore</name>
		<value>false</value>
	</property>
	<property>
		<name>datanucleus.fixedDatastore</name>
		<value>false</value>
	</property>
	<property>
		<name>datanucleus.autoCreateSchema</name>
		<value>true</value>
	</property>
	<property>
		<name>datanucleus.schema.autoCreateAll</name>
		<value>true</value>
	</property>
	<property>
		<name>datanucleus.autoCreateTables</name>
		<value>true</value>
	</property>
	<property>
		<name>datanucleus.autoCreateColumns</name>
		<value>true</value>
	</property>
	<property>
		<name>hive.metastore.local</name>
		<value>true</value>
	</property>
	<!-- Print column headers in query results -->
	<property>
		<name>hive.cli.print.header</name>
		<value>true</value>
	</property>
	<!-- Show the current database name in the CLI prompt -->
	<property>
		<name>hive.cli.print.current.db</name>
		<value>true</value>
	</property>
</configuration>

  2. Install MySQL: follow the post below
    MySQL installation post

  3. Download the MySQL JDBC driver and copy it into Hive's lib directory. Download link:

wget https://dev.mysql.com/get/Downloads/Connector-J/mysql-connector-java-5.1.46.tar.gz
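The tarball contains the jar one directory down; a sketch of unpacking it and copying the jar into Hive's lib directory, wrapped in a small helper (the Hive path is the one used in this post):

```shell
# Unpack a Connector/J tarball and copy the driver jar into Hive's lib dir.
# Usage: install_jdbc_driver <tarball> <hive-lib-dir>
install_jdbc_driver() {
  workdir=$(mktemp -d)
  tar -zxf "$1" -C "$workdir"
  cp "$workdir"/mysql-connector-java-5.1.46/mysql-connector-java-5.1.46-bin.jar "$2"/
  rm -rf "$workdir"
}
# e.g. install_jdbc_driver mysql-connector-java-5.1.46.tar.gz /usr/local/hive-3.1.2/lib
```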
  4. Run Hive
[root@master ~]# hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/hive-3.1.2/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
which: no hbase in (/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/usr/java/jdk1.8.0_92/bin:/usr/java/jdk1.8.0_92/jre/bin:/usr/local/hadoop/bin:/sbin:/usr/local/hadoop/lib:/usr/local/hive/hive-3.1.2/bin:/root/
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/hive-3.1.2/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = 3b065512-79d2-4e74-ac88-3e37cc8f6f68

Logging initialized using configuration in file:/usr/local/hive/hive-3.1.2/conf/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
Hive Session ID = f60b474e-db13-4224-b5b6-f7bbe9c3b563
hive (default)> show databases;
OK
database_name
default
Time taken: 0.634 seconds, Fetched: 1 row(s)
hive (default)> show databases;
OK
database_name
default
Time taken: 0.109 seconds, Fetched: 1 row(s)
hive (default)> 

  5. Problems encountered (mostly when running hive)
    5.1: guava jar version conflict between Hadoop and Hive
    Fix:
    copy guava-27.0-jre.jar from /usr/local/hadoop/share/hadoop/common/lib
    and use it to replace guava-19.0.jar in /usr/local/hive/hive-3.1.2/lib
The exception:
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
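The replacement described above can be scripted; a sketch wrapped in a small helper (jar versions are the ones found in this setup — check your actual lib directories first):

```shell
# Swap Hive's bundled guava for Hadoop's newer copy, keeping a backup.
# Usage: replace_guava <hive-lib-dir> <hadoop-common-lib-dir>
replace_guava() {
  mv "$1"/guava-19.0.jar "$1"/guava-19.0.jar.bak
  cp "$2"/guava-27.0-jre.jar "$1"/
}
# e.g. replace_guava /usr/local/hive/hive-3.1.2/lib /usr/local/hadoop/share/hadoop/common/lib
```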

5.2: Hive fails to connect because Hadoop is not running
Fix: start the Hadoop cluster

Exception in thread "main" java.lang.RuntimeException: java.net.ConnectException: Call From master.hadoop/192.168.140.138 to master.hadoop:9000 failed on connection exception: java.net.Connect
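After starting the cluster (start-dfs.sh / start-yarn.sh under $HADOOP_HOME/sbin), it can help to confirm the NameNode RPC port is actually reachable before retrying hive. A bash sketch using the host and port from the error above:

```shell
# Check whether a TCP port is open, via bash's /dev/tcp pseudo-device.
# Usage: port_open <host> <port>   (returns 0 if the port accepts connections)
port_open() {
  timeout 2 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}
# e.g.: port_open master.hadoop 9000 && echo "NameNode reachable"
```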

5.3: Hive starts, but show databases throws an exception
Cause: the metastore database was never initialized; running the schema initialization command fixes it
Fix: from the hive-3.1.2 directory, run: schematool -dbType mysql -initSchema

[root@master ~]# hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/hive-3.1.2/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
which: no hbase in (/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/usr/java/jdk1.8.0_92/bin:/usr/java/jdk1.8.0_92/jre/bin:/usr/local/hadoop/bin:/sbin:/usr/local/hadoop/lib:/usr/local/hive/hive-3.1.2/bin:/root/n:/usr/java/jdk1.8.0_92/jre/bin:/usr/local/hadoop/bin:/sbin:/usr/local/hadoop/lib:/usr/local/hive/hive-3.1.2/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/hive-3.1.2/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = a873f2ca-d735-4960-9e4c-48eea78119e9

Logging initialized using configuration in file:/usr/local/hive/hive-3.1.2/conf/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive (default)> 
              > ;
hive (default)> show databases;
FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

5.4: Exception:
org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory
Cause: Hadoop is in safe mode; the filesystem is read-only and rejects deletes and modifications
Fix: change the safe-mode state

Check the safe-mode status:

hdfs dfsadmin -safemode get

Enter safe mode:

hdfs dfsadmin -safemode enter

Leave safe mode:

hdfs dfsadmin -safemode leave