Hadoop Basics Tutorial - Chapter 13: Compiling from Source (13.4 Compiling Hive 2.1.1 from Source)

Chapter 13: Compiling from Source

13.4 Compiling Hive 2.1.1 from Source


13.4.1 Downloading the Source Code

https://mirrors.tuna.tsinghua.edu.cn/apache/hive/

Click stable-2 and download the source archive apache-hive-2.1.1-src.tar.gz.
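The archive can also be fetched and unpacked from the shell. A minimal sketch, assuming the file sits under the stable-2 directory of the mirror above (verify the exact path in the directory listing first):

# Download and unpack the source, then enter the source tree
wget https://mirrors.tuna.tsinghua.edu.cn/apache/hive/stable-2/apache-hive-2.1.1-src.tar.gz
tar -zxvf apache-hive-2.1.1-src.tar.gz
cd apache-hive-2.1.1-src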

13.4.2 Preparing the Environment

The Hive 2.1.1 source requires JDK 1.8 to build. Compiling with JDK 1.7 keeps producing odd problems, such as "GC overhead limit exceeded" errors.
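Before building, make sure the JDK that Maven actually resolves is 1.8. A quick check; the JAVA_HOME path below is only an example, adjust it to your installation:

# Both commands should report a 1.8.x Java version
java -version
mvn -version

# If an older JDK is picked up, point JAVA_HOME at a JDK 1.8 installation
export JAVA_HOME=/usr/java/jdk1.8.0_131
export PATH=$JAVA_HOME/bin:$PATH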

The following error appeared when building with the Aliyun mirror enabled:

The POM for org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde is missing, no dependency information available

So, disable the Aliyun mirror by commenting it out in Maven's settings.xml:

[root@cyq apache-hive-2.1.1-src]# vi /opt/apache-maven-3.3.9/conf/settings.xml
  <mirrors>
    <!-- mirror
     | Specifies a repository mirror site to use instead of a given repository. The repository that
     | this mirror serves has an ID that matches the mirrorOf element of this mirror. IDs are used
     | for inheritance and direct lookup purposes, and must be unique across the set of mirrors.
     |
    <mirror>
      <id>mirrorId</id>
      <mirrorOf>repositoryId</mirrorOf>
      <name>Human Readable Name for this Mirror.</name>
      <url>http://my.repository.com/repo/path</url>
    </mirror>
     -->
       <!--
    <mirror>
        <id>nexus-aliyun</id>
        <mirrorOf>*</mirrorOf>
        <name>Nexus aliyun</name>
        <url>http://maven.aliyun.com/nexus/content/groups/public</url>
    </mirror>
       -->
  </mirrors>
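To confirm the commented-out mirror is really out of play, Maven can print the settings it actually resolves; the aliyun entry should no longer appear under the mirrors section. This is just an optional sanity check:

# Show the effective settings Maven will use for the build
mvn help:effective-settings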

13.4.3 Starting the Build

[root@cyq apache-hive-2.1.1-src]# export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=1024m"
[root@cyq apache-hive-2.1.1-src]# mvn clean package -Phadoop-2 -Pdist -DskipTests -Dtar

[INFO] Reactor Summary:
[INFO] 
[INFO] Hive ............................................... SUCCESS [  0.877 s]
[INFO] Hive Shims Common .................................. SUCCESS [  1.735 s]
[INFO] Hive Shims 0.23 .................................... SUCCESS [  1.321 s]
[INFO] Hive Shims Scheduler ............................... SUCCESS [  0.509 s]
[INFO] Hive Shims ......................................... SUCCESS [  0.415 s]
[INFO] Hive Storage API ................................... SUCCESS [  0.536 s]
[INFO] Hive ORC ........................................... SUCCESS [  1.636 s]
[INFO] Hive Common ........................................ SUCCESS [  2.218 s]
[INFO] Hive Service RPC ................................... SUCCESS [  1.044 s]
[INFO] Hive Serde ......................................... SUCCESS [  1.632 s]
[INFO] Hive Metastore ..................................... SUCCESS [  7.775 s]
[INFO] Hive Ant Utilities ................................. SUCCESS [  0.131 s]
[INFO] Hive Llap Common ................................... SUCCESS [  1.101 s]
[INFO] Hive Llap Client ................................... SUCCESS [  0.708 s]
[INFO] Hive Llap Tez ...................................... SUCCESS [  0.886 s]
[INFO] Spark Remote Client ................................ SUCCESS [  1.677 s]
[INFO] Hive Query Language ................................ SUCCESS [ 36.962 s]
[INFO] Hive Llap Server ................................... SUCCESS [ 38.075 s]
[INFO] Hive Service ....................................... SUCCESS [01:26 min]
[INFO] Hive Accumulo Handler .............................. SUCCESS [ 31.816 s]
[INFO] Hive JDBC .......................................... SUCCESS [01:37 min]
[INFO] Hive Beeline ....................................... SUCCESS [ 15.614 s]
[INFO] Hive CLI ........................................... SUCCESS [  0.772 s]
[INFO] Hive Contrib ....................................... SUCCESS [  0.536 s]
[INFO] Hive HBase Handler ................................. SUCCESS [ 20.985 s]
[INFO] Hive HCatalog ...................................... SUCCESS [ 48.139 s]
[INFO] Hive HCatalog Core ................................. SUCCESS [  5.561 s]
[INFO] Hive HCatalog Pig Adapter .......................... SUCCESS [  4.961 s]
[INFO] Hive HCatalog Server Extensions .................... SUCCESS [ 25.777 s]
[INFO] Hive HCatalog Webhcat Java Client .................. SUCCESS [  0.893 s]
[INFO] Hive HCatalog Webhcat .............................. SUCCESS [ 29.462 s]
[INFO] Hive HCatalog Streaming ............................ SUCCESS [  0.941 s]
[INFO] Hive HPL/SQL ....................................... SUCCESS [ 24.698 s]
[INFO] Hive HWI ........................................... SUCCESS [  0.588 s]
[INFO] Hive Llap External Client .......................... SUCCESS [  0.576 s]
[INFO] Hive Shims Aggregator .............................. SUCCESS [  0.022 s]
[INFO] Hive TestUtils ..................................... SUCCESS [  0.041 s]
[INFO] Hive Packaging ..................................... SUCCESS [ 42.897 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 08:57 min
[INFO] Finished at: 2017-07-28T14:56:43+08:00
[INFO] Final Memory: 271M/1300M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "hadoop-2" could not be activated because it does not exist.
[root@cyq apache-hive-2.1.1-src]# 
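The final warning about the "hadoop-2" profile is harmless: as the log itself says, the profile no longer exists in Hive 2.x, which builds against Hadoop 2 only, so -Phadoop-2 can simply be dropped from the command. With -Pdist and -Dtar, the binary distribution should end up under packaging/target. A minimal sketch of locating and unpacking it (the install path is only an example):

# The binary tarball produced by -Pdist -Dtar
ls packaging/target/apache-hive-2.1.1-bin.tar.gz

# Unpack it wherever Hive should be installed
tar -zxvf packaging/target/apache-hive-2.1.1-bin.tar.gz -C /opt/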