Before installing Hive, you need to have Hadoop installed; see the Hadoop installation guide (link).
Package preparation
The version downloaded here is hive-3.1.2 (link).
Also download mysql-connector-java-5.1.46.jar (link).
Installation and deployment
Extract (not compress, as the original text mistakenly said) the downloaded archive into the target directory with tar -zxvf apache-hive-3.1.2-bin.tar.gz -C /export/servers/.
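The extract step can be sketched end-to-end. The demo below builds a throwaway archive and uses a temp directory standing in for /export/servers/, so it runs anywhere; on a real machine you would run only the second tar line against the real download.

```shell
# Throwaway stand-ins for the real archive and /export/servers/
workdir=$(mktemp -d)
mkdir -p "$workdir/apache-hive-3.1.2-bin/lib" "$workdir/export/servers"
tar -zcf "$workdir/apache-hive-3.1.2-bin.tar.gz" -C "$workdir" apache-hive-3.1.2-bin

# The actual step from this post: extract the archive into the servers directory
tar -zxvf "$workdir/apache-hive-3.1.2-bin.tar.gz" -C "$workdir/export/servers/"
ls "$workdir/export/servers"   # the extracted directory name ends in -bin
```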
1. Add environment variables
Append Hive's paths to /etc/profile:
export JAVA_HOME=/export/servers/jdk1.8.0_261
export HADOOP_HOME=/export/servers/hadoop
export HIVE_HOME=/export/servers/apache-hive-3.1.2-bin
export HIVE_CONF_DIR=/export/servers/apache-hive-3.1.2-bin/conf
After adding these lines, run source /etc/profile to apply them.
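The append-and-source step can be sketched as follows; the demo writes to a temporary file standing in for /etc/profile, so it is safe to run as-is.

```shell
# Temp file stands in for /etc/profile; on a real machine append to /etc/profile itself
profile=$(mktemp)
cat >> "$profile" <<'EOF'
export JAVA_HOME=/export/servers/jdk1.8.0_261
export HADOOP_HOME=/export/servers/hadoop
export HIVE_HOME=/export/servers/apache-hive-3.1.2-bin
export HIVE_CONF_DIR=/export/servers/apache-hive-3.1.2-bin/conf
EOF

# Same effect as `source /etc/profile`: the variables become visible in this shell
. "$profile"
echo "$HIVE_HOME"
```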
2. Add the JDBC driver
Copy mysql-connector-java-5.1.46.jar into /export/servers/apache-hive-3.1.2-bin/lib.
3. Modify hive-env.sh
Rename hive-env.sh.template to hive-env.sh with mv hive-env.sh.template hive-env.sh, then add the following lines to hive-env.sh:
export JAVA_HOME=/export/servers/jdk1.8.0_261
export HADOOP_HOME=/export/servers/hadoop
export HIVE_HOME=/export/servers/apache-hive-3.1.2-bin
export HIVE_CONF_DIR=/export/servers/apache-hive-3.1.2-bin/conf
4. Add hive-site.xml
Create a new hive-site.xml file in the conf directory with the following content:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
    <description>username to use against metastore database</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>111111</value>
    <description>password to use against metastore database</description>
  </property>
</configuration>
Testing the database
Run ../bin/schematool -dbType mysql -initSchema. The command fails with the following error:
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5099)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:97)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:81)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:699)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
This happens because the guava version in hadoop/share/hadoop/common/lib differs from the one in apache-hive-3.1.2-bin/lib. Copy Hadoop's newer guava jar into Hive's lib directory and delete Hive's older guava jar.
Re-run ../bin/schematool -dbType mysql -initSchema and the schema now initializes successfully.
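The guava swap can be sketched like this. The demo uses temp directories with dummy jar files standing in for the real lib paths, so it runs anywhere; the version numbers are typical for Hadoop 3.x and Hive 3.1.2, so check the actual jar names under your own install before deleting anything.

```shell
# Temp stand-ins for /export/servers/hadoop/share/hadoop/common/lib
# and /export/servers/apache-hive-3.1.2-bin/lib
hadoop_lib=$(mktemp -d)
hive_lib=$(mktemp -d)
touch "$hadoop_lib/guava-27.0-jre.jar"   # Hadoop 3.x ships a newer guava
touch "$hive_lib/guava-19.0.jar"         # Hive 3.1.2 ships an older guava

# The fix: delete Hive's older guava, then copy Hadoop's newer one over
rm -f "$hive_lib"/guava-*.jar
cp "$hadoop_lib"/guava-*.jar "$hive_lib"/
ls "$hive_lib"
```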
Fixing the warning shown at Hive startup: WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, ...
In hive-site.xml, change jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true to jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true&useSSL=false. After restarting, the long warning no longer appears.
Supplementary note
Unexpected character '=' (code 61); expected a semi-colon after the reference for entity 'useUnicode
If you hit the error above, remember that a bare & cannot be used in an XML file; it must be written as its escape sequence &amp; instead.
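With the escape applied, the connection-URL property from above would read:

```xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true&amp;useSSL=false</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
```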