Installing Hive 1.2.1

Prerequisites: a Hadoop 2.7.3 pseudo-distributed environment

    Linux CentOS 6

    the apache-hive-1.2.1-bin.tar.gz package

Installation: 1. Unpack: tar zxvf apache-hive-1.2.1-bin.tar.gz (extracts into the current directory)

    2. Optionally add the Hive environment variables to /etc/profile; this is not strictly required (a sample of the entries is sketched below).
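
If you do configure it, a minimal sketch of the /etc/profile entries (the HIVE_HOME path is illustrative and should match wherever you unpacked Hive):

# append to /etc/profile, then reload it with: source /etc/profile
export HIVE_HOME=/home/root/apache-hive-1.2.1-bin
export PATH=$PATH:$HIVE_HOME/bin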

    3. In the conf directory under the Hive root, make copies of the two template files:

cp hive-env.sh.template hive-env.sh

cp hive-default.xml.template hive-site.xml

  The parts changed in hive-env.sh are as follows:

# HADOOP_HOME=${bin}/../../hadoop
HADOOP_HOME=/root/hadoop-2.7.3
# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=/home/root/apache-hive-1.2.1-bin/conf

# Folder containing extra libraries required for hive compilation/execution can be controlled by:
export HIVE_AUX_JARS_PATH=/home/root/apache-hive-1.2.1-bin/lib
The parts changed in hive-site.xml are as follows:

<property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/home/root/apache-hive-1.2.1-bin/warehouse</value>
    <description>location of default database for the warehouse</description>
  </property>

<property>
    <name>hive.exec.scratchdir</name>
    <value>/home/root/apache-hive-1.2.1-bin/tmp</value>
    <description>HDFS root scratch dir for Hive jobs which gets created with write all (733) permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/<username> is created, with ${hive.scratch.dir.permission}.</description>
  </property>

<property>
    <name>hive.querylog.location</name>
    <value>/home/root/apache-hive-1.2.1-bin/log</value>
    <description>Location of Hive run time structured log file</description>
  </property>
Of course, these three directories must be created beforehand, as sketched below.
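
A minimal sketch of creating them, assuming the local-filesystem paths configured above (note that hive.metastore.warehouse.dir and hive.exec.scratchdir are normally HDFS locations, so depending on your setup you may need hadoop fs -mkdir -p instead):

# create the warehouse, scratch, and query-log directories used in hive-site.xml
mkdir -p /home/root/apache-hive-1.2.1-bin/warehouse
mkdir -p /home/root/apache-hive-1.2.1-bin/tmp
mkdir -p /home/root/apache-hive-1.2.1-bin/log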

I also switched the metastore from the default Derby database to the MySQL database already installed on this machine:

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://127.0.0.1:3306/hive?createDatabaseIfNotExist=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
   <name>javax.jdo.option.ConnectionUserName</name>
   <value>root</value>
   <description>username to use against metastore database</description>
</property>
<property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>root</value>
    <description>password to use against metastore database</description>
</property>
Also remember to copy mysql-connector-java-5.1.6.jar into the lib directory under the Hive root.
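
For example (the source path is illustrative; adjust it to wherever your connector jar actually lives):

cp mysql-connector-java-5.1.6.jar /home/root/apache-hive-1.2.1-bin/lib/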

At this point, go into the bin directory under the Hive root and run hive to start it...

It errored out, complaining about hadoop version: the class VersionInfo containing the main method could not be found. That had to be an environment-variable problem; a look at the environment variables turned up some stale, leftover Hadoop entries, and removing them fixed it. Starting again, there was still an error, shown below:

Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
	at org.apache.hadoop.fs.Path.initialize(Path.java:205)
	at org.apache.hadoop.fs.Path.<init>(Path.java:171)
	at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:563)
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
	... 8 more
Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
	at java.net.URI.checkPath(URI.java:1823)
	at java.net.URI.<init>(URI.java:745)
	at org.apache.hadoop.fs.Path.initialize(Path.java:202)
	... 11 more
This means Hive cannot resolve the variable named ${system:java.io.tmpdir}. I created a directory under the Hive root (literally named system:java.io.tmpdir) and then replaced every ${system:java.io.tmpdir} in conf/hive-site.xml with an absolute path. That resolved the problem, and the next start went through without errors.
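
A minimal sketch of that substitution, assuming /home/root/apache-hive-1.2.1-bin/iotmp as the chosen absolute path (the directory name is illustrative; ${system:user.name} appears in the same error, so it is replaced here too):

# create a concrete scratch directory and point hive-site.xml at it
mkdir -p /home/root/apache-hive-1.2.1-bin/iotmp
sed -i 's|\${system:java.io.tmpdir}|/home/root/apache-hive-1.2.1-bin/iotmp|g' conf/hive-site.xml
sed -i 's|\${system:user.name}|root|g' conf/hive-site.xml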

Finally, start the web UI service: bin/hive --service hwi

It would not start and reported: FATAL hwi.HWIServer: HWI WAR file not found at ${env:HWI_WAR_FILE}. No WAR file exists at the location given by that setting, so I went back to hive-site.xml to check the hwi-related configuration and then searched the whole installation; the binary release really does not ship an hwi WAR. The only option is to download the matching 1.2.1 source from the official site and build the WAR yourself:

tar zxvf apache-hive-1.2.1-src.tar.gz

cd apache-hive-1.2.1-src/hwi

jar cfM hive-hwi-1.2.1.war -C web ./

Copy the packaged WAR into Hive's lib directory:

cp hive-hwi-1.2.1.war /home/root/apache-hive-1.2.1-bin/lib/

Then update the WAR file path in hive-site.xml accordingly:

<property>
    <name>hive.hwi.war.file</name>
    <value>/lib/hive-hwi-1.2.1.war</value>
    <description>This sets the path to the HWI war file, relative to ${HIVE_HOME}. </description>
  </property>

Restart once more... and it finally starts successfully. (To run it in the background: hive --service hwi 2>/tmp/hwi2.log &) BUT, visiting the web page at http://IP:9999/hwi/ shows:

HTTP ERROR 500
Problem accessing /hwi/index.jsp. Reason:
    JSP support not configured
Powered by Jetty://
JSP pages could not be rendered, and the hive web startup log also contained the message "did not find org.apache.jasper.servlet.JspServlet". So the Jasper-related jars were missing; in the end it took three extra jars in Hive's lib directory to make it work: jasper.jar, jasper-el.jar, and tomcat-juli.jar.
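
A minimal sketch of where those jars might come from, assuming a local Apache Tomcat 7 installation (CATALINA_HOME is illustrative; jasper.jar and jasper-el.jar sit under Tomcat's lib directory, while tomcat-juli.jar sits under its bin directory):

# copy the Jasper JSP engine jars into Hive's lib directory
cp $CATALINA_HOME/lib/jasper.jar /home/root/apache-hive-1.2.1-bin/lib/
cp $CATALINA_HOME/lib/jasper-el.jar /home/root/apache-hive-1.2.1-bin/lib/
cp $CATALINA_HOME/bin/tomcat-juli.jar /home/root/apache-hive-1.2.1-bin/lib/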

After one more restart everything finally worked, and the clean HWI web page appeared.



Quite a few twists and turns; if I run into more problems later, I will add them here...

References: http://blog.csdn.net/gamer_gyt/article/details/47150621

http://www.cnblogs.com/hbwxcw/p/5960551.html

http://blog.csdn.net/ggz631047367/article/details/49979109
