Compiling Hadoop 2.6.0 from source on RedHat 7

        I had previously compiled hadoop 2.6.0 on a 32-bit Linux machine; this time I compiled it again on 64-bit RedHat 7. Besides the required JDK, Maven, and protobuf, the build also needs the system libraries gcc, gcc-c++, ncurses-devel, openssl-devel, and cmake. These can all be installed in one go with yum: yum install -y gcc gcc-c++ cmake openssl-devel ncurses-devel (package names separated by spaces).
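As a minimal sketch, the full preparation might look like the following. It assumes protobuf 2.5.0 (the version Hadoop 2.6.0 expects) has already been downloaded as protobuf-2.5.0.tar.gz in the current directory; paths and file names are illustrative.

# Install the system build dependencies in one shot
yum install -y gcc gcc-c++ cmake openssl-devel ncurses-devel

# Build and install protobuf 2.5.0 (assumed already downloaded locally)
tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
make && make install

# Verify the compiler is usable; it should print "libprotoc 2.5.0"
protoc --version
# If protoc complains about missing shared libraries, make /usr/local/lib
# visible to the loader, e.g.:
# echo /usr/local/lib > /etc/ld.so.conf.d/protobuf.conf && ldconfig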

        This time I again ran into the problem that the build only works with JDK 1.7. With JDK 1.8, the build fails with an error like the following: [ERROR] Exit code: 1 - /home/hadoop/hadoop-2.6.0-src/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/InterfaceStability.java:27: error: unexpected end tag: </ul>


Given this error, the only option was to switch to JDK 1.7, after which the build went through cleanly.
Build command: mvn package -DskipTests -Pdist,native -Dtar
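A quick sketch of switching the build over to JDK 1.7 before running Maven; the JDK path below is only an example and should match wherever JDK 1.7 is installed on your machine.

# Point the build at a JDK 1.7 installation (example path)
export JAVA_HOME=/usr/java/jdk1.7.0_80
export PATH=$JAVA_HOME/bin:$PATH
java -version    # should report a 1.7.x version

# Build the native distribution, skipping tests
mvn package -DskipTests -Pdist,native -Dtar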
The build succeeded; the Maven reactor summary is shown below:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................. SUCCESS [  1.041 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  1.018 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.937 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.255 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 15.730 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 14.448 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 25.414 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 15.027 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  7.052 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [02:01 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  8.556 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [02:31 min]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.029 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [02:46 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [02:42 min]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 14.788 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  3.519 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.029 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.037 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [01:12 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 25.486 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.044 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 11.236 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 19.452 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  2.385 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [  5.028 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 15.531 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  4.019 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [  5.003 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.044 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  2.060 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  1.525 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.047 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [  4.122 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [  5.211 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.051 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 16.098 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 12.579 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  3.310 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [  6.972 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  5.790 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [  7.923 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  1.437 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  4.272 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [  4.270 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  5.100 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 12.644 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  1.797 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  4.551 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  3.380 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  2.232 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  2.054 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  2.312 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  8.269 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  4.217 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 29.522 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [  7.187 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.106 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  3.483 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [  8.224 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.025 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 49.890 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 18:13 min
[INFO] Finished at: 2017-06-23T03:56:38+08:00
[INFO] Final Memory: 157M/350M
[INFO] ------------------------------------------------------------------------
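After the build finishes, the compiled distribution should end up under hadoop-dist/target in the source tree (the standard layout for a -Pdist -Dtar build; verify against your own checkout):

# Tarball and unpacked distribution produced by -Pdist,native -Dtar
ls hadoop-dist/target/hadoop-2.6.0.tar.gz
ls hadoop-dist/target/hadoop-2.6.0/lib/native/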
If Hadoop is not built from source for the target platform, hadoop, spark, flink, and similar tools will often print a warning like this at startup:
2017-06-23 21:57:49,105 WARN  org.apache.hadoop.util.NativeCodeLoader  - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
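Once the freshly built distribution (with its lib/native directory) is in place, a quick way to confirm that the native libraries actually load is the checknative subcommand of the hadoop CLI:

# Run from the unpacked hadoop-2.6.0 directory produced by the build
bin/hadoop checknative -a
# A successful native build reports "hadoop: true" with the path to libhadoop.so,
# along with entries for zlib, snappy, lz4 and openssl.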