Environment
OS: Linux CentOS 7
JDK version: jdk1.8.0_141
JDK download link:
Hadoop version: hadoop-2.7.6
Hadoop download link:
Hive version: apache-hive-2.0.0
Hive download link:
MySQL version: 5.5.49
MySQL download link:
Hive on MapReduce Deployment
1. Extract the Hive archive and configure environment variables
Extract the Hive archive:
tar -zxvf apache-hive-2.0.0-bin.tar.gz
Add the Hive environment variables:
vi /etc/profile
export HIVE_HOME=/usr/tools/apache-hive-2.0.0-bin
export PATH=$PATH:$HIVE_HOME/bin
source /etc/profile
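A quick check that the variables took effect (the paths assume the archive was extracted to /usr/tools):
echo $HIVE_HOME
hive --version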
2. Install MySQL 5.5.49
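The exact install commands depend on how MySQL 5.5.49 is packaged (rpm bundle or tarball), so the sketch below only covers what Hive needs afterwards: a running server and the root/123456 account from hive-site.xml with access to the metastore database. The database name hive matches the JDBC URL and is created automatically thanks to createDatabaseIfNotExist=true.
service mysqld start   (the service name may be mysql instead of mysqld, depending on the package)
mysql -u root -p
Inside the MySQL client:
GRANT ALL PRIVILEGES ON hive.* TO 'root'@'localhost' IDENTIFIED BY '123456';
GRANT ALL PRIVILEGES ON hive.* TO 'root'@'%' IDENTIFIED BY '123456';
FLUSH PRIVILEGES;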
3. Copy the MySQL JDBC driver jar mysql-connector-java-5.1.7-bin.jar into Hive's lib directory
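For example, assuming the driver jar sits in the current directory:
cp mysql-connector-java-5.1.7-bin.jar $HIVE_HOME/lib/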
4. Configure Hive
a. Create the new directory /user/hive/warehouse on HDFS
hdfs dfs -mkdir /tmp
hdfs dfs -mkdir /user
hdfs dfs -mkdir /user/hive
hdfs dfs -mkdir /user/hive/warehouse
hadoop fs -chmod g+w /tmp
hadoop fs -chmod g+w /user/hive/warehouse
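The directories can be verified with a listing:
hadoop fs -ls /user/hive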
b. Configure the hive-site.xml file (under $HIVE_HOME/conf; create it if it does not exist)
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
<!-- MySQL connection URL -->
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://127.0.0.1:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<!-- MySQL JDBC driver -->
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<!-- MySQL username -->
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>root</value>
</property>
<!-- MySQL password -->
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>123456</value>
</property>
<!-- Local scratch space for Hive jobs -->
<property>
<name>hive.exec.local.scratchdir</name>
<value>/usr/tools/apache-hive-2.0.0-bin/tmp</value>
</property>
<!-- Local directory for resources downloaded by Hive -->
<property>
<name>hive.downloaded.resources.dir</name>
<value>/usr/tools/apache-hive-2.0.0-bin/tmp/resources</value>
</property>
<!-- Location of Hive query logs -->
<property>
<name>hive.querylog.location</name>
<value>/usr/tools/apache-hive-2.0.0-bin/tmp</value>
</property>
<!-- Location of HiveServer2 operation logs -->
<property>
<name>hive.server2.logging.operation.log.location</name>
<value>/usr/tools/apache-hive-2.0.0-bin/tmp/operation_logs</value>
</property>
</configuration>
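The local directories referenced above (under the Hive install path) may not exist yet; if Hive reports a missing scratch or log directory on startup, create them manually:
mkdir -p /usr/tools/apache-hive-2.0.0-bin/tmp/resources
mkdir -p /usr/tools/apache-hive-2.0.0-bin/tmp/operation_logs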
5. Initialize the metastore schema
schematool -initSchema -dbType mysql
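If initialization succeeds, the hive database in MySQL now contains the metastore tables (names such as DBS and TBLS come from the standard Hive metastore schema), which can be checked with:
mysql -u root -p -e "use hive; show tables;"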
6. Test that the installation succeeded
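For example, start the Hive CLI and run a few statements; the count query should launch a MapReduce job (the table name test is just an illustrative example):
hive
hive> show databases;
hive> create table test(id int, name string);
hive> select count(*) from test;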
7. Configure HiveServer2 (remote access to Hive)
a. Add the following to the core-site.xml file in Hadoop's configuration directory (etc/hadoop for Hadoop 2.x). These proxyuser settings let the root user, which HiveServer2 runs as here, impersonate other users; after editing, refresh or restart Hadoop as noted after the snippet.
<property>
<name>hadoop.proxyuser.root.hosts</name>
<value>*</value>
</property>
<property>
<name>hadoop.proxyuser.root.groups</name>
<value>*</value>
</property>
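If the Hadoop cluster is already running, the proxyuser change has to be picked up: either restart HDFS and YARN, or refresh the proxyuser settings in place:
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration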
b. Start HiveServer2
Run on the command line: hiveserver2
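hiveserver2 runs in the foreground and listens on port 10000 by default; to keep it running after the terminal closes, one option is:
nohup hiveserver2 > /tmp/hiveserver2.log 2>&1 &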
c. Connect to Hive with Beeline
beeline
!connect jdbc:hive2://192.168.100.101:10000 root
Enter the password, and Hive is ready to use.
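Once connected, a quick query confirms that the Beeline session works:
show databases;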