Compiling Hive 1.1.0 from Source

Overview

Hive is obtained by building the source with Maven (this walkthrough uses the hive-1.1.0-cdh5.7.0 source release).

1. Pre-installation preparation

1.1 Download the hive-1.1.0-cdh5.7.0 source package and its dependencies

Download link

[root@hadoop ~]# cd /tmp/
[root@hadoop tmp]# wget http://archive.cloudera.com/cdh5/cdh/5/hive-1.1.0-cdh5.7.0-src.tar.gz
1.2 Install and deploy Oracle JDK 1.8 (avoid OpenJDK if possible) (omitted)
1.3 Set the global Java environment variables (omitted)
1.4 Install and deploy Maven 3.3.9 and set its global environment variables (omitted)
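
Steps 1.2–1.4 are omitted above; they amount to unpacking the JDK and Maven and exporting them globally. A minimal sketch, assuming the JDK is unpacked to /usr/java/jdk1.8.0_144 and Maven to /usr/local/apache-maven-3.3.9 (both paths are illustrative, adjust them to your layout):

[root@hadoop ~]# tail -n 3 /etc/profile
export JAVA_HOME=/usr/java/jdk1.8.0_144
export MAVEN_HOME=/usr/local/apache-maven-3.3.9
export PATH=$JAVA_HOME/bin:$MAVEN_HOME/bin:$PATH
[root@hadoop ~]# source /etc/profile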

2. Building hive-1.1.0-cdh5.7.0 from source

2.1 Unpack the Hive source

Reference link

[root@hadoop tmp]# tar xf hive-1.1.0-cdh5.7.0-src.tar.gz
[root@hadoop tmp]# cd hive-1.1.0-cdh5.7.0
[root@hadoop hive-1.1.0-cdh5.7.0]# 
2.2 Other dependencies
[root@hadoop ~]# yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
[root@hadoop ~]# yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake
[root@hadoop ~]# yum install -y ant patch
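
With the system packages in place, it is worth confirming that the JDK and Maven from steps 1.2–1.4 are the ones actually picked up on the PATH before starting the build (the version output will vary with your exact install):

[root@hadoop ~]# java -version
[root@hadoop ~]# mvn -version
[root@hadoop ~]# echo $JAVA_HOME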
2.3 Hive build reference documentation

Compile Hive on branch-1

In branch-1, Hive supports both Hadoop 1.x and 2.x. You will need to specify which version of Hadoop to build against via a Maven profile. To build against Hadoop 1.x use the profile hadoop-1; for Hadoop 2.x use hadoop-2. For example to build against Hadoop 1.x, the above mvn command becomes: $ mvn clean package -Phadoop-1,dist

#### The current environment needs to build against Hadoop 2.x; the build parameters are as follows

[root@hadoop hive-1.1.0-cdh5.7.0]# screen -S hive-compile
[root@hadoop hive-1.1.0-cdh5.7.0]# source /etc/profile
[root@hadoop hive-1.1.0-cdh5.7.0]# mvn -Phadoop-2 -Pdist -DskipTests -Dmaven.javadoc.skip=true clean package

Press Ctrl+a then d to temporarily detach from the screen session.
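
To check on the build later, reattach to the session. If the compile aborts with an out-of-memory error, a larger Maven heap usually helps; export it and re-run the same mvn command (the 2g value below is only a suggestion):

[root@hadoop ~]# screen -r hive-compile
[root@hadoop hive-1.1.0-cdh5.7.0]# export MAVEN_OPTS="-Xmx2g"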

[INFO] Building tar: /tmp/hive-1.1.0-cdh5.7.0/packaging/target/apache-hive-1.1.0-cdh5.7.0-bin.tar.gz
[INFO] Building tar: /tmp/hive-1.1.0-cdh5.7.0/packaging/target/apache-hive-1.1.0-cdh5.7.0-src.tar.gz
[INFO] 
[INFO] --- maven-dependency-plugin:2.8:copy (copy) @ hive-packaging ---
[INFO] Configured Artifact: org.apache.hive:hive-jdbc:standalone:1.1.0-cdh5.7.0:jar
[INFO] Copying hive-jdbc-1.1.0-cdh5.7.0-standalone.jar to /tmp/hive-1.1.0-cdh5.7.0/packaging/target/apache-hive-1.1.0-cdh5.7.0-jdbc.jar
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:attach-artifact (attach-jdbc-driver) @ hive-packaging ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Hive ............................................... SUCCESS [  3.164 s]
[INFO] Hive Shims Common .................................. SUCCESS [  7.302 s]
[INFO] Hive Shims 0.23 .................................... SUCCESS [  2.942 s]
[INFO] Hive Shims Scheduler ............................... SUCCESS [  1.864 s]
[INFO] Hive Shims ......................................... SUCCESS [  0.952 s]
[INFO] Hive Common ........................................ SUCCESS [  5.773 s]
[INFO] Hive Serde ......................................... SUCCESS [  5.392 s]
[INFO] Hive Metastore ..................................... SUCCESS [ 15.928 s]
[INFO] Hive Ant Utilities ................................. SUCCESS [  1.234 s]
[INFO] Spark Remote Client ................................ SUCCESS [  5.519 s]
[INFO] Hive Query Language ................................ SUCCESS [ 51.445 s]
[INFO] Hive Service ....................................... SUCCESS [ 10.060 s]
[INFO] Hive Accumulo Handler .............................. SUCCESS [  4.404 s]
[INFO] Hive JDBC .......................................... SUCCESS [01:17 min]
[INFO] Hive Beeline ....................................... SUCCESS [  2.482 s]
[INFO] Hive CLI ........................................... SUCCESS [  2.929 s]
[INFO] Hive Contrib ....................................... SUCCESS [  3.179 s]
[INFO] Hive HBase Handler ................................. SUCCESS [  5.205 s]
[INFO] Hive HCatalog ...................................... SUCCESS [  0.596 s]
[INFO] Hive HCatalog Core ................................. SUCCESS [  4.844 s]
[INFO] Hive HCatalog Pig Adapter .......................... SUCCESS [  2.859 s]
[INFO] Hive HCatalog Server Extensions .................... SUCCESS [  3.452 s]
[INFO] Hive HCatalog Webhcat Java Client .................. SUCCESS [  3.363 s]
[INFO] Hive HCatalog Webhcat .............................. SUCCESS [  5.720 s]
[INFO] Hive HCatalog Streaming ............................ SUCCESS [  3.446 s]
[INFO] Hive HWI ........................................... SUCCESS [  2.397 s]
[INFO] Hive ODBC .......................................... SUCCESS [  2.972 s]
[INFO] Hive Shims Aggregator .............................. SUCCESS [  0.619 s]
[INFO] Hive TestUtils ..................................... SUCCESS [  0.611 s]
[INFO] Hive Packaging ..................................... SUCCESS [ 48.552 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:47 min
[INFO] Finished at: 2017-12-30T10:37:18+08:00
[INFO] Final Memory: 206M/945M
[INFO] ------------------------------------------------------------------------


#### The tar package produced by the successful build
[root@hadoop hive-1.1.0-cdh5.7.0]# cd packaging/target/
[root@hadoop target]# ls apache-hive-1.1.0-cdh5.7.0-bin.tar.gz
apache-hive-1.1.0-cdh5.7.0-bin.tar.gz
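
As a quick sanity check on the artifact, its size and the first few entries can be listed (a spot check only, not part of the build itself):

[root@hadoop target]# ls -lh apache-hive-1.1.0-cdh5.7.0-bin.tar.gz
[root@hadoop target]# tar -tzf apache-hive-1.1.0-cdh5.7.0-bin.tar.gz | head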