Hive Installation
1. Hive is installed on top of an existing Hadoop and MySQL setup.
2. Unpack the tarball: tar -zxvf hive-1.1.0-cdh5.8.0.tar.gz
3. Add the environment variables: vi ~/.bash_profile
Append the following:
export HIVE_HOME=/root/hive-1.1.0-cdh5.8.0
export PATH=$PATH:$HIVE_HOME/bin
Make them take effect:
source ~/.bash_profile
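The two export lines can be sanity-checked before touching the real profile; this sketch sources them from a temporary file (a stand-in for ~/.bash_profile) and prints the result:

```shell
# Sketch: verify the profile additions take effect (temp file stands in for ~/.bash_profile)
profile=$(mktemp)
cat >> "$profile" <<'EOF'
export HIVE_HOME=/root/hive-1.1.0-cdh5.8.0
export PATH=$PATH:$HIVE_HOME/bin
EOF
. "$profile"        # same effect as: source ~/.bash_profile
echo "$HIVE_HOME"   # → /root/hive-1.1.0-cdh5.8.0
```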
5. Create hive-site.xml:
touch hive-1.1.0-cdh5.8.0/conf/hive-site.xml
Copy the MySQL JDBC driver into Hive's lib directory:
mv /root/mysql-connector-java-5.1.6-bin.jar /root/hive-1.1.0-cdh5.8.0/lib/
Copy jline-2.12.jar into Hadoop's yarn/lib directory, replacing jline-0.9.94.jar; otherwise Hive fails to start:
cp /root/jline-2.12.jar /root/hadoop-2.6.0-cdh5.8.0/yarn/lib/
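The jar moves above can be rehearsed in a throwaway sandbox before touching /root; note that "replacing" jline means the old jline-0.9.94.jar must actually be removed, not just overlaid:

```shell
# Sandbox rehearsal of the jar moves (directory names mirror the /root layout above)
root=$(mktemp -d)
yarnlib="$root/hadoop-2.6.0-cdh5.8.0/yarn/lib"
hivelib="$root/hive-1.1.0-cdh5.8.0/lib"
mkdir -p "$yarnlib" "$hivelib"
touch "$root/mysql-connector-java-5.1.6-bin.jar" "$root/jline-2.12.jar" "$yarnlib/jline-0.9.94.jar"

mv "$root/mysql-connector-java-5.1.6-bin.jar" "$hivelib/"  # JDBC driver into Hive's lib
rm "$yarnlib/jline-0.9.94.jar"                             # remove the old jline first
cp "$root/jline-2.12.jar" "$yarnlib/"                      # then drop in jline-2.12
```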
6. Edit hive-site.xml:
<configuration>
<property>
<name>hive.exec.scratchdir</name>
<value>/tmp/hivedir</value>
</property>
<property>
<name>hive.exec.local.scratchdir</name>
<value>/user/hive/tmp</value>
</property>
<property>
<name>hive.downloaded.resources.dir</name>
<value>/user/hive/tmp</value>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/user/hive/warehouse</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>123456</value>
</property>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://192.168.0.134:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>root</value>
</property>
<property>
<name>hive.querylog.location</name>
<value>/root/hive/querylog</value>
</property>
<property>
<name>hive.hwi.war.file</name>
<value>lib/hive-hwi-1.1.0.jar</value>
</property>
<property>
<name>hive.server2.logging.operation.log.location</name>
<value>/root/hive/operationlogs</value>
</property>
<property>
<name>hive.server2.transport.mode</name>
<value>binary</value>
</property>
<property>
<name>hive.server2.thrift.bind.host</name>
<value>192.168.0.118</value>
</property>
<property>
<name>hive.server2.thrift.http.port</name>
<value>10002</value>
</property>
<property>
<name>hive.security.authorization.enabled</name>
<value>false</value>
</property>
<property>
<name>hive.server2.authentication.spnego.principal</name>
<value>ALL</value>
</property>
<property>
<name>hive.server2.webui.host</name>
<value>192.168.0.118</value>
</property>
<property>
<name>hive.server2.webui.port</name>
<value>10003</value>
</property>
<property>
<name>hive.server2.thrift.port</name>
<value>10000</value>
<description>Port number of HiveServer2 Thrift interface when hive.server2.transport.mode is 'binary'.</description>
</property>
<property>
<name>oozie.credentials.credentialclasses</name>
<value>hive2=org.apache.oozie.action.hadoop.Hive2Credentials</value>
</property>
</configuration>
(Adjust the highlighted values, such as the MySQL address and password and the HiveServer2 host, to your actual environment.)
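After editing, a property can be double-checked by grepping its value back out of hive-site.xml. The sketch below writes a minimal copy so it is self-contained; on the real host, point site at $HIVE_HOME/conf/hive-site.xml instead:

```shell
# Self-contained check: extract ConnectionURL from a hive-site.xml
site=$(mktemp)
cat > "$site" <<'EOF'
<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://192.168.0.134:3306/hive?createDatabaseIfNotExist=true</value>
</property>
</configuration>
EOF
# the <value> line follows the <name> line, so grab it with -A1
grep -A1 'javax.jdo.option.ConnectionURL' "$site" | grep -o 'jdbc:mysql://[^<]*'
```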
7. Edit hive-env.sh and add (adjust the path to your installation):
export HADOOP_HOME=/root/hadoop-2.6.0-cdh5.8.0
8. Create the HDFS directories and open up their permissions:
hdfs dfs -mkdir -p /user/hive/warehouse
hdfs dfs -mkdir -p /user/hive/tmp
hdfs dfs -mkdir -p /root/hive/operationlogs
hdfs dfs -chmod 777 /user/hive/warehouse
hdfs dfs -chmod 777 /user/hive/tmp
hdfs dfs -chmod 777 /root/hive/operationlogs
hdfs dfs -mkdir -p /tmp/hivedir
hdfs dfs -chmod 777 /tmp/hivedir
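The eight commands above compress into one loop. The sketch defaults to a dry run, echoing each hdfs invocation so it can be previewed off-cluster; set HDFS=hdfs on the real host to execute:

```shell
# Dry run by default: prints each hdfs command instead of executing it
HDFS="${HDFS:-echo hdfs}"   # set HDFS=hdfs on the cluster to run for real
for d in /user/hive/warehouse /user/hive/tmp /root/hive/operationlogs /tmp/hivedir; do
  $HDFS dfs -mkdir -p "$d"
  $HDFS dfs -chmod 777 "$d"
done
```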
9. Startup
Start the metastore, then HiveServer2:
# hive --service metastore &
# hive --service hiveserver2 &
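Startup order matters: the metastore must be up before HiveServer2 can serve queries. The sketch below echoes the sequence as a dry run (set RUN= on the real host to execute) and adds a beeline smoke test, with user, host, and port assumed from the hive-site.xml above:

```shell
RUN=echo   # dry run; set RUN= on the real host to execute for real
$RUN hive --service metastore &     # start the metastore first
$RUN hive --service hiveserver2 &   # then HiveServer2
wait
# smoke test once both are up (user/host/port taken from hive-site.xml):
$RUN beeline -u jdbc:hive2://192.168.0.118:10000 -n root -e 'show databases;'
```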