Building Hive 1.2.1 from Source

Building Hive 1.2.1 from source requires Hadoop 2.6.0 or later, because the code uses Hadoop's org.apache.hadoop.crypto.key.KeyProvider and org.apache.hadoop.crypto.key.KeyProviderFactory classes, which only appeared in Hadoop 2.6.0. With an older Hadoop version the build fails with errors like these:

[ERROR] /home/q/spark/apache-hive-1.2.1-src/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java:[43,36] package org.apache.hadoop.crypto.key does not exist
[ERROR] /home/q/spark/apache-hive-1.2.1-src/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java:[41,36] package org.apache.hadoop.crypto.key does not exist
[ERROR] /home/q/spark/apache-hive-1.2.1-src/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java:[42,48] package org.apache.hadoop.crypto.key.KeyProvider does not exist

Hive 1.2.1 already defaults to Hadoop 2.6.0, so no modification is needed.
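If you want to verify this, or build against a different Hadoop release (still 2.6.0 or newer) without editing the pom, a minimal sketch follows; the property name hadoop-23.version is an assumption based on the Hive 1.2.x build and should be checked against your own pom.xml:

# Check the Hadoop version declared for the hadoop-2 profile in the root pom.xml
grep -n "hadoop-23.version" pom.xml

# Override it on the command line instead of editing the pom
# (must stay >= 2.6.0, otherwise the KeyProvider errors above reappear)
mvn clean package -Phadoop-2 -DskipTests -Dhadoop-23.version=2.6.0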

The highest Spark version Hive 1.2.1 can be built against is Spark 1.4.1. If you bump it to a Spark 1.5.x release, the build fails with the following error:

[ERROR] /home/q/spark/apache-hive-1.2.1-src/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[441,11] org.apache.hive.spark.client.RemoteDriver.ClientListener is not abstract and does not override abstract method onBlockUpdated(org.apache.spark.scheduler.SparkListenerBlockUpdated) in org.apache.spark.scheduler.SparkListener

Hive 1.2.1 defaults to Spark 1.3.1. Unless you have special requirements, leaving pom.xml untouched will let the build succeed in one pass.
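To confirm which Spark version the build will pull in, or to pin it explicitly, a minimal sketch is below; the property name spark.version is an assumption and should be verified against your pom.xml:

# Show the Spark version declared in the root pom.xml
grep -n "spark.version" pom.xml

# If you override it, stay at or below 1.4.1,
# otherwise RemoteDriver fails to compile (see the error above)
mvn clean package -Phadoop-2 -DskipTests -Dspark.version=1.4.1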

Maven must be installed and configured before building. If you run into out-of-memory problems during the build, set the following:

export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

Once the above is in place, we can build with the following command:

mvn clean package -Phadoop-2 -DskipTests

If everything goes well, you should see a build result like this:

[INFO] Reactor Summary:
[INFO]
[INFO] Hive ............................................... SUCCESS [  3.227 s]
[INFO] Hive Shims Common .................................. SUCCESS [  1.923 s]
[INFO] Hive Shims 0.20S ................................... SUCCESS [  0.797 s]
[INFO] Hive Shims 0.23 .................................... SUCCESS [  3.250 s]
[INFO] Hive Shims Scheduler ............................... SUCCESS [  0.967 s]
[INFO] Hive Shims ......................................... SUCCESS [  0.916 s]
[INFO] Hive Common ........................................ SUCCESS [  2.660 s]
[INFO] Hive Serde ......................................... SUCCESS [  1.846 s]
[INFO] Hive Metastore ..................................... SUCCESS [  3.906 s]
[INFO] Hive Ant Utilities ................................. SUCCESS [  0.320 s]
[INFO] Spark Remote Client ................................ SUCCESS [  5.191 s]
[INFO] Hive Query Language ................................ SUCCESS [ 25.649 s]
[INFO] Hive Service ....................................... SUCCESS [  1.492 s]
[INFO] Hive Accumulo Handler .............................. SUCCESS [  2.095 s]
[INFO] Hive JDBC .......................................... SUCCESS [  6.638 s]
[INFO] Hive Beeline ....................................... SUCCESS [  0.865 s]
[INFO] Hive CLI ........................................... SUCCESS [  0.928 s]
[INFO] Hive Contrib ....................................... SUCCESS [  0.725 s]
[INFO] Hive HBase Handler ................................. SUCCESS [  2.840 s]
[INFO] Hive HCatalog ...................................... SUCCESS [  0.489 s]
[INFO] Hive HCatalog Core ................................. SUCCESS [  1.226 s]
[INFO] Hive HCatalog Pig Adapter .......................... SUCCESS [  1.003 s]
[INFO] Hive HCatalog Server Extensions .................... SUCCESS [  0.847 s]
[INFO] Hive HCatalog Webhcat Java Client .................. SUCCESS [  0.820 s]
[INFO] Hive HCatalog Webhcat .............................. SUCCESS [  3.950 s]
[INFO] Hive HCatalog Streaming ............................ SUCCESS [  0.803 s]
[INFO] Hive HWI ........................................... SUCCESS [  0.709 s]
[INFO] Hive ODBC .......................................... SUCCESS [  0.584 s]
[INFO] Hive Shims Aggregator .............................. SUCCESS [  0.060 s]
[INFO] Hive TestUtils ..................................... SUCCESS [  0.090 s]
[INFO] Hive Packaging ..................................... SUCCESS [  1.220 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11:19 min
[INFO] Finished at: 2015-11-09T19:33:33+08:00
[INFO] Final Memory: 99M/881M
[INFO] ------------------------------------------------------------------------

How long the build takes depends largely on your network, so pointing Maven at your company's internal repository mirror is a good idea (a minimal sketch follows the file listing below). If you also want to package the compiled output into a distribution, run the following command:

mvn clean package -Phadoop-2 -DskipTests -Pdist

Once that command finishes, the following files are produced under the $HIVE_SRC/packaging/target directory:

drwxr-xr-x 2 iteblog iteblog     4096 May  9 13:13 antrun
drwxr-xr-x 3 iteblog iteblog     4096 May  9 13:14 apache-hive-1.2.1-bin
-rw-r--r-- 1 iteblog iteblog 92836488 May  9 13:14 apache-hive-1.2.1-bin.tar.gz
-rw-r--r-- 1 iteblog iteblog 17360091 May  9 13:14 apache-hive-1.2.1-jdbc.jar
-rw-r--r-- 1 iteblog iteblog 14318686 May  9 13:14 apache-hive-1.2.1-src.tar.gz
drwxr-xr-x 4 iteblog iteblog     4096 May  9 13:14 archive-tmp
drwxr-xr-x 3 iteblog iteblog     4096 May  9 13:13 maven-shared-archive-resources
drwxr-xr-x 3 iteblog iteblog     4096 May  9 13:13 tmp
drwxr-xr-x 2 iteblog iteblog     4096 May  9 13:13 warehouse
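
As mentioned above, most of the build time is spent downloading dependencies, so an internal Maven mirror helps a lot. A minimal sketch of a mirror entry in ~/.m2/settings.xml is shown below; the repository URL is a placeholder, and this overwrites any existing settings.xml, so back yours up or merge the entry by hand:

# Write a minimal settings.xml pointing the central repository at an internal mirror
# (placeholder URL; back up any existing ~/.m2/settings.xml first)
cat > ~/.m2/settings.xml <<'EOF'
<settings>
  <mirrors>
    <mirror>
      <id>internal</id>
      <mirrorOf>central</mirrorOf>
      <url>http://maven.example.com/repository/public/</url>
    </mirror>
  </mirrors>
</settings>
EOF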


Source: https://www.iteblog.com/archives/1527.html
