1. Installation
# Download
$ wget https://mirrors.tuna.tsinghua.edu.cn/apache/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz
# Extract and rename; the install path ends up as /hadoop/hive-3.1.2
$ tar -zxvf apache-hive-3.1.2-bin.tar.gz
$ mv apache-hive-3.1.2-bin hive-3.1.2
# Download mysql-connector-java-5.1.48.jar
wget https://repo1.maven.org/maven2/mysql/mysql-connector-java/5.1.48/mysql-connector-java-5.1.48.jar
# Copy it into Hive's lib directory
cp mysql-connector-java-5.1.48.jar /Users/zheng/hive-3.1.2/lib
# Set environment variables
vim /etc/profile
# Add:
export HIVE_HOME=/hadoop/hive-3.1.2
export PATH=$PATH:$HIVE_HOME/bin
# Reload the profile, then check that the setup worked
source /etc/profile
hive --version
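The PATH mechanics above can be rehearsed in a throwaway sandbox before touching /etc/profile; the fake `hive` script below is purely illustrative, standing in for the real /hadoop/hive-3.1.2 install:

```shell
# Create a sandbox that mimics the hive-3.1.2 layout
SANDBOX=$(mktemp -d)
mkdir -p "$SANDBOX/hive-3.1.2/bin"
# A stand-in hive launcher (illustration only)
printf '#!/bin/sh\necho fake-hive\n' > "$SANDBOX/hive-3.1.2/bin/hive"
chmod +x "$SANDBOX/hive-3.1.2/bin/hive"
# Same two exports as in /etc/profile, pointed at the sandbox
export HIVE_HOME="$SANDBOX/hive-3.1.2"
export PATH="$PATH:$HIVE_HOME/bin"
command -v hive
```

If the exports are correct, `command -v hive` resolves to `$HIVE_HOME/bin/hive`.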
Running hive --version fails with:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/zheng/hadoop/hive-3.1.2/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/zheng/hadoop/hadoop-3.2.1/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5099)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:97)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:81)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:699)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Cause: Hadoop is installed locally, and the two bundled guava versions conflict: Hadoop ships /hadoop/hadoop-3.2.1/share/hadoop/common/lib/guava-27.0-jre.jar while Hive ships /hadoop/hive-3.1.2/lib/guava-19.0.jar. Delete the older /hadoop/hive-3.1.2/lib/guava-19.0.jar and copy guava-27.0-jre.jar into /hadoop/hive-3.1.2/lib. Note: you must keep the higher version; keeping the lower one still produces the error.
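The jar swap just described can be rehearsed in a throwaway sandbox (the real directories are /hadoop/hadoop-3.2.1/share/hadoop/common/lib and /hadoop/hive-3.1.2/lib; the empty files below just stand in for the jars):

```shell
# Sandbox directories standing in for the Hadoop and Hive lib dirs
SANDBOX=$(mktemp -d)
HADOOP_LIB="$SANDBOX/hadoop/lib"
HIVE_LIB="$SANDBOX/hive/lib"
mkdir -p "$HADOOP_LIB" "$HIVE_LIB"
touch "$HADOOP_LIB/guava-27.0-jre.jar" "$HIVE_LIB/guava-19.0.jar"
# Delete the old guava that ships with Hive...
rm "$HIVE_LIB/guava-19.0.jar"
# ...and copy in Hadoop's newer one
cp "$HADOOP_LIB/guava-27.0-jre.jar" "$HIVE_LIB/"
ls "$HIVE_LIB"
```

Afterwards only guava-27.0-jre.jar remains in Hive's lib directory, matching the version Hadoop loads.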
2. Edit the configuration
cd /hadoop/hive-3.1.2/conf
cp hive-env.sh.template hive-env.sh
cp hive-default.xml.template hive-site.xml
#In hive-site.xml, delete all parameters inside <configuration> and keep only the ones below, filling in the username and password of your local MySQL database
vim hive-site.xml
<!-- For mysql-connector-java 5.x the driver class is com.mysql.jdbc.Driver; from 6.x onward it is com.mysql.cj.jdbc.Driver -->
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <!-- the & separators in the URL must be escaped as &amp; inside XML -->
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true&amp;characterEncoding=utf8&amp;useSSL=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>yourpassword</value>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/Users/zheng/hive/warehouse</value>
</property>
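A common trap with the ConnectionURL property is pasting the raw JDBC URL into the XML: each `&` between query parameters must become `&amp;`, or Hive fails to parse hive-site.xml. A one-liner to produce the escaped form:

```shell
# Raw JDBC URL as you would type it on a command line
URL='jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true&characterEncoding=utf8&useSSL=true'
# Escape each & as &amp; for use inside an XML <value> element
ESCAPED=$(printf '%s' "$URL" | sed 's/&/\&amp;/g')
echo "$ESCAPED"
```

Paste the escaped output into the `<value>` element.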
3. Initialize the metastore
#Initialize the metastore
$ cd /hadoop/hive-3.1.2/bin
$ schematool -dbType mysql -initSchema
My first run of schematool -dbType mysql -initSchema failed with "command not found"; on review, the password in hive-site.xml had also been mistyped. Once the command succeeds, a database named hive containing the metastore tables appears in MySQL.
Q1: If schematool -dbType mysql -initSchema fails with an error, it may be because a database named hive already exists locally; drop it and rerun the command.
Q2: If dropping the hive database fails with Error dropping database in SQL (can't rmdir './hive', errno: 66), run sudo find / -name hive to locate MySQL's hive database files on disk and delete them manually.
Confirm the initialization succeeded
#Start hive
$ hive
#To see logs during startup, use the following instead
hive --hiveconf hive.root.logger=DEBUG,console
#List the databases
hive> show databases;
OK
default
Time taken: 1.011 seconds, Fetched: 1 row(s)
hive>
Some guides online also set environment variables in hive-env.sh; since they are already set globally in /etc/profile here, that step is unnecessary.