1. Install MySQL (from the yum repository)
1.1 Check for the latest repo package
First check the latest release at:
https://dev.mysql.com/downloads/repo/yum/
1.2 Download it
wget http://dev.mysql.com/get/mysql57-community-release-el7-11.noarch.rpm
1.3 Install the MySQL repo package
yum install mysql57-community-release-el7-11.noarch.rpm
1.4 Install the MySQL server (this can take quite a while)
yum install mysql-community-server
1.5 Confirm the installation finished
systemctl status mysqld.service
1.6 Start MySQL
systemctl start mysqld.service
1.7 Look up the initial root password
grep "password" /var/log/mysqld.log
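The relevant line reads "A temporary password is generated for root@localhost: …", and the password is its last whitespace-separated field. A hedged sketch of pulling it out, run here on a sample line rather than the real /var/log/mysqld.log (the password shown is made up for illustration):

```shell
# Sample log line standing in for /var/log/mysqld.log.
sample='2021-01-01T00:00:00.0Z 1 [Note] A temporary password is generated for root@localhost: Abc!1234xyz'
# grep picks the line, awk prints its last field -- the password itself.
echo "$sample" | grep 'temporary password' | awk '{print $NF}'
```

Against the real log, the same pipeline is `grep 'temporary password' /var/log/mysqld.log | awk '{print $NF}'`.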
1.8 Log in as root
mysql -uroot -p
Note: the password is the one found in the previous step.
1.9 Change the root password
ALTER USER 'root'@'localhost' IDENTIFIED BY 'Root@123456';
Note: this step is mandatory. The new password must mix upper- and lower-case letters, include a special character, and be at least 8 characters long.
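One way to generate a password that satisfies those rules (a sketch, assuming openssl is available; the fixed suffix just guarantees an upper-case letter, a digit, and a special character are present):

```shell
# 12 random base64 characters plus a fixed "Aa1!" suffix; total length 16,
# comfortably over the 8-character minimum.
pw="$(openssl rand -base64 9)Aa1!"
echo "$pw"
```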
1.10 Enable remote access
GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY 'Root@123456' WITH GRANT OPTION;
Then run:
FLUSH PRIVILEGES;
(Note: the GRANT ... IDENTIFIED BY form is MySQL 5.7 syntax; MySQL 8.0 requires a separate CREATE USER before GRANT.)
1.11 Start on boot
systemctl enable mysqld
systemctl daemon-reload
1.12 Relax the MySQL password policy
# check the current policy
show variables like '%validate_password_policy%';
show variables like '%validate_password_length%';
# relax the policy (MySQL enforces a minimum of 4 for validate_password_length, so the value below is effectively clamped to 4)
set global validate_password_policy=0;
set global validate_password_length=1;
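These set global changes last only until the next restart. To persist them, the same settings can go in my.cnf; a sketch that builds the fragment in a temp file (the real target would be /etc/my.cnf, which is an assumption about your layout):

```shell
# Build the my.cnf fragment in a temp file rather than touching /etc.
cnf=$(mktemp)
cat > "$cnf" <<'EOF'
[mysqld]
validate_password_policy=0
# 4 is the minimum MySQL accepts for this variable
validate_password_length=4
EOF
cat "$cnf"
```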
1.13 Create a hive user
create user 'hive'@'%' identified by 'hive';
Optionally reset the root password to something simpler as well (this only passes validation after relaxing the policy above):
# remote
GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY 'root' WITH GRANT OPTION;
# local
GRANT ALL PRIVILEGES ON *.* TO 'root'@'localhost' IDENTIFIED BY 'root' WITH GRANT OPTION;
FLUSH PRIVILEGES;
1.14 Create the Hive database
Log in to MySQL as root.
# database for the Hive metastore
create database db_hive;
Grant it to the MySQL user hive:
GRANT ALL ON *.* TO 'hive'@'%';
flush privileges;
2. Install Hive
2.1 Download and unpack Hive
sudo tar -zxvf apache-hive-3.1.2-bin.tar.gz -C /usr/local/
Rename the directory (inside /usr/local):
mv apache-hive-3.1.2-bin/ hive
2.2 Environment variables
Append to a profile script, e.g. /etc/profile.d/hadoop03.sh:
#hive
export HIVE_HOME=/usr/local/hive/
export PATH=$PATH:$HIVE_HOME/bin
Then apply and verify:
source /etc/profile.d/hadoop03.sh
echo $HIVE_HOME
If the path is printed, the environment variables took effect.
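The write/source/echo round trip above can be rehearsed self-contained, with a temp file standing in for /etc/profile.d/hadoop03.sh:

```shell
# Write the same two export lines to a temp profile script...
prof=$(mktemp)
cat > "$prof" <<'EOF'
export HIVE_HOME=/usr/local/hive/
export PATH=$PATH:$HIVE_HOME/bin
EOF
# ...source it, then confirm the variable is visible in this shell.
. "$prof"
echo "$HIVE_HOME"
```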
2.3 Check the version
hive --version
2.4 Edit the configuration
Enter Hive's conf directory: cd /usr/local/hive/conf
vim hive-site.xml
Add this configuration:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/db_hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
  </property>
  <property>
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
  </property>
</configuration>
Note: if MySQL runs on another host (e.g. hadoop01), put that hostname in place of localhost, and adjust the username, password, and database name to match your setup.
2.5 Download the MySQL connector jar
Enter Hive's lib directory, then:
wget http://central.maven.org/maven2/mysql/mysql-connector-java/5.1.47/mysql-connector-java-5.1.47.jar
(Note: central.maven.org has since been retired; the same artifact is served from repo1.maven.org.)
2.6 Initialize the Hive metastore schema
Enter Hive's bin directory: cd /usr/local/hive/bin
Run:
schematool -dbType mysql -initSchema
If it fails with:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5104)
at org.apache.hive.beeline.HiveSchemaTool.<init>(HiveSchemaTool.java:96)
at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1473)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
then the cause is a guava version mismatch between Hadoop and Hive; back up the older jar and copy in the newer one:
hive:   guava-19.0.jar      (in /usr/local/hive/lib)
hadoop: guava-27.0-jre.jar  (in /usr/local/hadoop/share/hadoop/common/lib)
# back up Hive's older jar
mv /usr/local/hive/lib/guava-19.0.jar /usr/local/hive/lib/guava-19.0.jar.backup
# copy Hadoop's newer jar into Hive
cp /usr/local/hadoop/share/hadoop/common/lib/guava-27.0-jre.jar /usr/local/hive/lib/
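The swap can be rehearsed safely on throwaway directories that mimic the two lib/ layouts (mktemp dirs stand in for the real Hadoop and Hive paths):

```shell
# Fake lib directories standing in for the real paths.
hadoop_lib=$(mktemp -d)   # stands in for /usr/local/hadoop/share/hadoop/common/lib
hive_lib=$(mktemp -d)     # stands in for /usr/local/hive/lib
touch "$hadoop_lib/guava-27.0-jre.jar" "$hive_lib/guava-19.0.jar"

# Back up Hive's older guava, then copy Hadoop's newer one across.
mv "$hive_lib/guava-19.0.jar" "$hive_lib/guava-19.0.jar.backup"
cp "$hadoop_lib/guava-27.0-jre.jar" "$hive_lib/"
ls "$hive_lib"
```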
Run schematool -dbType mysql -initSchema again; it should now complete and report the initialized schema version.
2.7 Inspect the initialized metadata
Log in to MySQL and run:
use db_hive;
show tables;
The metastore tables (TBLS, DBS, and so on) should be listed.
2.8 Enter the Hive shell
Type hive and press Enter.