Building Hadoop 2.6 on Mac OS X: Common Build and Install Problems

A few notes

It helps a lot to have MacPorts, Homebrew, or a similar package manager installed; it will save you plenty of time (a quick install-and-verify sketch follows the tool list below).

Required tools

* Unix System

This one is required.

* JDK 1.6+

Don't use too new a JDK; in my case Hadoop 2.6 failed to compile several times on JDK 1.8, so I recommend 1.7.

* Maven 3.0 or later

Also required.

* Findbugs 1.3.9 (if running findbugs)

Findbugs is optional.

* ProtocolBuffer 2.5.0

This one is required.

* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac

Just download it from the CMake website.

* Zlib devel (if compiling native code)
* openssl devel ( if compiling native hadoop-pipes )

These were already installed on my machine, so I didn't install them this time.

* Internet connection for first build (to fetch all Maven and Hadoop dependencies)

Make sure you are online, because the build downloads a huge number of dependencies.
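For reference, here is a minimal sketch of getting the prerequisites in place with Homebrew (the package names are my assumption; in particular, current Homebrew formulas ship a newer ProtocolBuffer, so you may have to build 2.5.0 from source), plus a quick version check before kicking off the build:

# assuming Homebrew; MacPorts users can install the equivalent ports
brew install maven cmake

# sanity-check the toolchain before starting
java -version       # should report 1.7.x
mvn -v
cmake --version     # must be 3.0 or newer on Mac
protoc --version    # must print "libprotoc 2.5.0"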

Build command

mvn package -Pdist,native,docs -DskipTests -Dtar
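If you do not need the documentation, you can drop docs from the profile list to shave some time off the build; this matches the variants described in Hadoop's BUILDING.txt:

mvn package -Pdist,native -DskipTests -Dtar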

Common errors

Error message

Exception in thread "main" java.lang.AssertionError: Missing tools.jar at: /Library/Java/JavaVirtualMachines/jdk1.7.0_71.jdk/Contents/Home/Classes/classes.jar. Expression: file.exists()

Fix

sudo mkdir $JAVA_HOME/Classes
sudo ln -sf $JAVA_HOME/lib/tools.jar $JAVA_HOME/Classes/classes.jar
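Note that both commands assume JAVA_HOME is set. If it is not, set it first; on OS X something like this should work (adjust the version if you use a different JDK):

export JAVA_HOME=$(/usr/libexec/java_home -v 1.7)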

Error message

Time out

Fix

This happened while building the fifth sub-module: Maven printed a long stack trace, but the gist of it was a timeout, probably because my network was slow. Re-running the build got past that point.
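If you would rather not start over from the very beginning, Maven can resume from the module that failed using the -rf (--resume-from) flag. The module name below is only a placeholder; use the one shown in your reactor summary:

mvn package -Pdist,native,docs -DskipTests -Dtar -rf :hadoop-common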

If the build ends with output like the following, it succeeded:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [  1.642 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  1.438 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  2.686 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.300 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  2.000 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.672 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  2.789 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  5.065 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  3.743 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:56 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  6.492 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 14.486 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.144 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [03:32 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [03:11 min]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [02:33 min]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  4.413 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.034 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.050 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [01:28 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [01:10 min]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.133 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 25.965 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [01:16 min]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  3.452 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [  8.736 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 23.658 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  6.801 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [  8.720 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.030 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  2.547 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  2.043 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.065 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [  6.182 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [  5.765 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.076 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 25.826 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 16.665 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  3.549 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 11.537 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  8.411 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 22.017 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  1.753 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  7.949 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [  4.888 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 14.673 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 13.820 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  2.740 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  8.339 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  5.882 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  3.981 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  2.934 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  3.290 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 36.593 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  6.642 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [02:09 min]
[INFO] Apache Hadoop Client ............................... SUCCESS [  9.129 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.281 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  4.498 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 17.674 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.072 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 57.905 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 24:35 min
[INFO] Finished at: 2015-02-17T22:13:02+08:00
[INFO] Final Memory: 195M/1051M
[INFO] ------------------------------------------------------------------------

According to the build instructions, the packaged distribution ends up at

{hadoop source dir.}/hadoop-dist/target/hadoop-2.6.0.tar.gz

Finally, go to

{hadoop source dir.}/hadoop-dist/target/hadoop-2.6.0

and run

bin/hadoop version

to check the version info. Yep, that's it!

Hadoop 2.6.0
Subversion Unknown -r Unknown
Compiled by wangzhiyu on 2015-02-17T13:48Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /Users/wangzhiyu/Downloads/hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar

Next up: installation.
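A minimal sketch of that first step, unpacking the freshly built tarball into a directory of your choice (the target path here is just an example):

# run from the Hadoop source directory
mkdir -p ~/opt
tar -xzf hadoop-dist/target/hadoop-2.6.0.tar.gz -C ~/opt
cd ~/opt/hadoop-2.6.0
bin/hadoop version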
