1. Download the installation package
http://apache.fayea.com/apache-mirror/hive/stable/apache-hive-1.1.0-bin.tar.gz
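If you prefer the command line, the archive can also be fetched with wget (assuming wget is installed and the mirror above is reachable):
$wget -P /home/hadoop http://apache.fayea.com/apache-mirror/hive/stable/apache-hive-1.1.0-bin.tar.gz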
2. Installation
(1) Upload the package: copy the Hive tarball to /home/hadoop
(2) Extract it: tar -zxvf apache-hive-1.1.0-bin.tar.gz (or extract the archive directly from the Ubuntu GUI)
(3) Rename the directory: mv apache-hive-1.1.0-bin hive-1.1.0 (mv oldname newname)
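A quick sanity check that the extraction and rename worked; the listing should include bin, conf and lib:
$ls /home/hadoop/hive-1.1.0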
3. Configure Hive
(1) Configure environment variables
$sudo gedit /etc/profile
On top of the existing CLASSPATH and PATH entries in this file, add Hive's paths and define HIVE_HOME:
export HIVE_HOME=/home/hadoop/hive-1.1.0
export PATH=$HIVE_HOME/bin:$PATH
export CLASSPATH=$CLASSPATH:$HIVE_HOME/lib
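To check the new variables without logging out, reload the profile and inspect them (step 4 sources the profile again before starting Hive, so this is only a verification):
$source /etc/profile
$echo $HIVE_HOME
$which hive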
(2) Configure hive-env.sh
Go to the configuration directory: $cd /home/hadoop/hive-1.1.0/conf
Open the configuration file: $gedit hive-env.sh (a fresh install only ships hive-env.sh.template, so create the file first if needed: $cp hive-env.sh.template hive-env.sh)
Settings:
# Set HADOOP_HOME to point to a specific hadoop install directory
HADOOP_HOME=/home/hadoop/hadoop-2.6.0
# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=/home/hadoop/hive-1.1.0/conf
(3) Configure hive-site.xml
Create the configuration file: $cp hive-default.xml.template hive-site.xml
Edit hive-site.xml as follows:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/home/hadoop/hive/warehouse</value>
<description>location of default database for the warehouse</description>
</property>
<property>
<name>hive.exec.scratchdir</name>
<value>/home/hadoop/hive/scratchdir</value>
<description>Scratch space for Hive jobs</description>
</property>
<property>
<name>hive.querylog.location</name>
<value>/home/hadoop/hive-1.1.0/logs</value>
<description>
Location of Hive run time structured log file
</description>
</property>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://master:3306/hive_metadata?createDatabaseIfNotExist=true</value>
<description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>Driver class name for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>root</value>
<description>username to use against metastore database</description>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>root123</value>
<description>password to use against metastore database</description>
</property>
</configuration>
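The warehouse and scratch directories above are resolved against the default filesystem, so on a cluster whose fs.defaultFS points at HDFS they normally need to exist (and be group-writable) before the first query runs. A minimal sketch, assuming the paths match the values configured above:
$hadoop fs -mkdir -p /home/hadoop/hive/warehouse
$hadoop fs -mkdir -p /home/hadoop/hive/scratchdir
$hadoop fs -chmod g+w /home/hadoop/hive/warehouse
$hadoop fs -chmod g+w /home/hadoop/hive/scratchdir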
(4) Create a local directory under hive-1.1.0
$mkdir local
(5) Configure log4j
In the conf directory under hive-1.1.0:
Create the configuration files:
cp hive-exec-log4j.properties.template hive-exec-log4j.properties
cp hive-log4j.properties.template hive-log4j.properties
Set the following in both of the files above (a sed one-liner alternative is sketched after this step):
hive.log.dir=/home/hadoop/hive-1.1.0/logs
log4j.appender.EventCounter=org.apache.hadoop.log.metrics.EventCounter
Note: if the logs directory does not exist, create it by running:
$mkdir /home/hadoop/hive-1.1.0/logs
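Instead of editing both properties files by hand, the hive.log.dir line can be rewritten in one pass with sed (a sketch; run it inside the conf directory and adjust the path if your layout differs):
$sed -i 's#^hive.log.dir=.*#hive.log.dir=/home/hadoop/hive-1.1.0/logs#' hive-exec-log4j.properties hive-log4j.properties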
(6) Add the MySQL JDBC driver
Download the driver: mysql-connector-java-5.1.9.jar
Add the driver: put the jar under $HIVE_HOME/lib
Replace Hadoop's jline library:
Back up jline-0.9.94.jar in $HADOOP_HOME/share/hadoop/yarn/lib by running
$mv jline-0.9.94.jar jline-0.9.94.jar.bak
Copy the newer jline shipped with Hive:
$cp $HIVE_HOME/lib/jline-2.12.jar $HADOOP_HOME/share/hadoop/yarn/lib
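Hive will only start cleanly if it can reach the MySQL server named in javax.jdo.option.ConnectionURL with the configured credentials. A minimal sketch of granting that access (user, password and database name are taken from hive-site.xml above; adjust them to your MySQL setup, and note that createDatabaseIfNotExist=true lets Hive create hive_metadata itself):
$mysql -u root -p
mysql> GRANT ALL PRIVILEGES ON hive_metadata.* TO 'root'@'%' IDENTIFIED BY 'root123';
mysql> FLUSH PRIVILEGES;
mysql> exit;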
4. Configuration is complete; start Hive
$source /etc/profile (make the environment changes take effect)
$hive
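If everything is wired up correctly the CLI drops you at a hive> prompt; a simple smoke test (the table name is arbitrary):
hive> show databases;
hive> create table test_install (id int);
hive> show tables;
hive> drop table test_install;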