1. System environment:
Hadoop is already installed
Ubuntu system
2. Installation steps
2.1: Install MySQL
sudo apt-get install mysql-server
Log in to MySQL. This guide assumes the default account root with password root:
mysql -u root -p
Inside MySQL, create a user named hive with password hive:
create user 'hive' identified by 'hive';
Grant the hive user privileges (without this, external connections cannot access the database):
grant all privileges on *.* to 'hive'@'%' identified by 'hive';
flush privileges;   (makes the grants take effect)
Once the user is created, log in as the hive user and create the hive database:
mysql> create database hive;
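The statements above can be collected into one reviewable script before applying them. This is a sketch: it assumes the root login and the hive/hive credentials from this guide, the `'%'` host wildcard, and the `GRANT ... IDENTIFIED BY` form, which works on MySQL 5.x but was removed in MySQL 8.

```shell
# Write the metastore setup statements to a file so they can be
# reviewed before running (credentials follow the guide above).
cat > /tmp/hive_metastore_setup.sql <<'EOF'
CREATE USER 'hive'@'%' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%' IDENTIFIED BY 'hive';
FLUSH PRIVILEGES;
CREATE DATABASE hive;
EOF
# Apply with: mysql -u root -p < /tmp/hive_metastore_setup.sql
```

Keeping the statements in a file also makes it easy to re-run them if the metastore has to be rebuilt later.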
2.2: Install Hive (when choosing a Hive version, check that it is compatible with your installed Hadoop version)
cd /usr/local/
sudo tar -zxvf apache-hive-2.3.6-bin.tar.gz   (extract the archive)
sudo mv apache-hive-2.3.6-bin hive   (rename it to hive)
sudo chown -R root ./hive/   (grant ownership)
Environment configuration:
sudo vim /etc/bash.bashrc
Add the following lines:
export HIVE_HOME=/usr/local/hive
export HCAT_HOME=$HIVE_HOME/hcatalog
export HIVE_CONF=$HIVE_HOME/conf
export PATH=$PATH:$HIVE_HOME/bin
Save the file, then make the Hive environment variables take effect with:
source /etc/bash.bashrc
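A quick sanity check that the variables resolved as intended; a sketch that re-creates the exports above (it assumes the /usr/local/hive install location used throughout this guide):

```shell
# Re-create the exports from /etc/bash.bashrc and confirm they expand correctly.
export HIVE_HOME=/usr/local/hive
export HCAT_HOME=$HIVE_HOME/hcatalog
export HIVE_CONF=$HIVE_HOME/conf
export PATH=$PATH:$HIVE_HOME/bin
echo "$HIVE_HOME"        # should print /usr/local/hive
case ":$PATH:" in
  *":$HIVE_HOME/bin:"*) echo "hive bin is on PATH" ;;
  *)                    echo "hive bin missing from PATH" ;;
esac
```

If `hive bin is on PATH` is printed, the `hive` and `schematool` commands below can be run from any directory.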
Go to the /usr/local/hive/conf directory
Create hive-site.xml (easiest by copying the template):
cp hive-default.xml.template hive-site.xml
Then modify the following properties:
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost:3306/hive?characterEncoding=UTF8&amp;createDatabaseIfNotExist=true</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
The user name used to connect to MySQL:
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
</property>
The password for that MySQL user:
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hive</value>
</property>
In hive-site.xml, replace every occurrence of ${system:java.io.tmpdir} with /usr/local/hive/tmp, and every occurrence of ${system:user.name} with ${user.name}.
P.S.: remember to create the tmp directory first (mkdir -p /usr/local/hive/tmp)
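Those replacements can be scripted with sed instead of doing them by hand in the editor. A sketch, shown first against a small sample file so the effect is visible; the real paths assume the install locations above:

```shell
# The two substitutions as sed scripts, demonstrated on a sample line.
printf '%s\n' \
  '<value>${system:java.io.tmpdir}/${system:user.name}</value>' \
  > /tmp/hive-site-sample.xml
sed -i.bak \
    -e 's|\${system:java.io.tmpdir}|/usr/local/hive/tmp|g' \
    -e 's|\${system:user.name}|${user.name}|g' \
    /tmp/hive-site-sample.xml
cat /tmp/hive-site-sample.xml   # -> <value>/usr/local/hive/tmp/${user.name}</value>
# Real run (the -i.bak keeps a backup of the original file):
#   sed -i.bak -e 's|\${system:java.io.tmpdir}|/usr/local/hive/tmp|g' \
#              -e 's|\${system:user.name}|${user.name}|g' \
#              /usr/local/hive/conf/hive-site.xml
#   mkdir -p /usr/local/hive/tmp
```

hive-default.xml.template has several thousand lines, so a scripted replacement is far less error-prone than editing occurrences one by one.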
Before initializing, make sure the MySQL JDBC driver jar (mysql-connector-java) is present in /usr/local/hive/lib, since the com.mysql.jdbc.Driver class configured above comes from that jar. Then, in the /usr/local/hive/bin directory, run:
schematool -dbType mysql -initSchema   (initializes the metastore schema in MySQL)
If you see an SLF4J error like this:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/datafs/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/datafs/hadoop/hadoop-3.1.1/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
it means the extra binding in log4j-slf4j-impl-2.10.0.jar was not removed: Hive's copy conflicts with Hadoop's own SLF4J binding.
Fix: rename the jar so it drops off the classpath:
cd /usr/local/hive/lib
mv log4j-slf4j-impl-2.10.0.jar log4j-slf4j-impl-2.10.0.jar.bak
If instead you get this error:
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5099)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:97)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:81)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:699)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Fix:
The com.google.common.base.Preconditions.checkArgument failure is caused by a version mismatch between the guava.jar bundled with Hive and the one bundled with Hadoop.
Check the version of guava.jar under Hadoop's /usr/local/hadoop/share/hadoop/common/lib directory,
then check the version of guava.jar under Hive's lib directory. If the two differ, delete the lower-version jar and copy the higher-version jar in its place.
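The comparison itself can be scripted: guava jars are named guava-&lt;version&gt;.jar, so `sort -V` can pick the newer of the two. A sketch with a hypothetical helper function; the commented commands assume the /usr/local install paths used above:

```shell
# Pick the newer of two guava jars by version-sorting their filenames.
# Hypothetical helper: pass it the jar names found under Hadoop's
# share/hadoop/common/lib and Hive's lib directories.
newer_guava() {
    printf '%s\n' "$1" "$2" | sort -V | tail -n 1
}
newer_guava guava-19.0.jar guava-27.0-jre.jar   # -> guava-27.0-jre.jar
# Typical fix for Hive 2.3.6 + Hadoop 3.x: drop Hive's older jar and copy
# Hadoop's newer one across, e.g.:
#   rm /usr/local/hive/lib/guava-*.jar
#   cp /usr/local/hadoop/share/hadoop/common/lib/guava-*.jar /usr/local/hive/lib/
```

`sort -V` compares the embedded version numbers numerically (so 19.0 sorts before 27.0), which plain lexicographic sort would not guarantee for all version strings.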
Then run hive from /usr/local/hive/bin; if it starts without errors, the installation succeeded.