Solving Problems Encountered While Compiling Hadoop 2.5.0

Local environment:

    Distribution: CentOS release 5.11 (Final)

    Kernel: 2.6.18-398.el5 #1 SMP Tue Sep 16 20:50:52 EDT 2014 x86_64 x86_64 x86_64 GNU/Linux

Hadoop version: Apache Hadoop 2.5.0

 

Required packages (a yum sketch for the packages available from the system repositories follows this list):

* JDK 1.6+

* Maven 3.0 or later

* Findbugs 1.3.9 (if running findbugs)

* ProtocolBuffer 2.5.0

* CMake 2.6 or newer (if compiling native code)

* Zlib devel (if compiling native code)

* openssl devel ( if compiling native hadoop-pipes )

* gcc  

* gcc-c++

* make

* ncurses-devel

* glibc-devel (both the x86_64 and i386 versions must be installed)
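
Most of the toolchain packages above can be pulled in with yum on CentOS 5; a minimal sketch (exact package names and repositories, e.g. whether cmake comes from the base repo or EPEL, may differ on your system):

# yum -y install gcc gcc-c++ make cmake zlib-devel openssl-devel ncurses-devel
# yum -y install glibc-devel.x86_64 glibc-devel.i386

The JDK, Maven, Findbugs, and Protocol Buffers are installed separately in the steps below.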

 

To compile and package the Hadoop source tree, use: mvn package -Pdist,native,docs -DskipTests -Dtar

 

1. Findbugs download page: https://sourceforge.net/projects/findbugs/files/findbugs/

        I used findbugs-2.0.3.tar.gz (see attachment).

        Mirror: http://pan.baidu.com/s/1mixv4ne
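
Findbugs only needs to be unpacked and pointed to by FINDBUGS_HOME. A minimal sketch, assuming the tarball is unpacked under /usr/local (the install path is an assumption; adjust it, and add the exports to /etc/profile or ~/.bash_profile so they persist across shells):

$ sudo tar -zxf findbugs-2.0.3.tar.gz -C /usr/local
$ export FINDBUGS_HOME=/usr/local/findbugs-2.0.3   # assumed install location
$ export PATH=$FINDBUGS_HOME/bin:$PATH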


2. Failed to parse plugin descriptor for org.apache.hadoop:hadoop-maven-plugins:2.2.0 (/Users/Howie/Downloads/release-2.2.0/hadoop-maven-plugins/target/classes): No plugin descriptor found at META-INF/maven/plugin.xml -> [Help 1]

 

Solution:


  $ cd hadoop-maven-plugins
  $ mvn install


3. 'protoc --version' did not return a version ->


  Solution: install Protocol Buffers 2.5.0.


  Download: http://pan.baidu.com/s/1qXQJsd

  Source code: https://github.com/google/protobuf

 

After installation, check that the installed files match the layout below. If they do not, copy the corresponding files into these locations:

/usr/bin/protoc
/usr/lib64/libprotoc.so.8
/usr/lib64/libprotoc.so.8.0.0

[yh.zeng@namenode1 protobuf-2.5.0]$ sudo cp bin/protoc /usr/bin/protoc
[yh.zeng@namenode1 protobuf-2.5.0]$ sudo cp lib/libprotoc.so.8 /usr/lib64/
[yh.zeng@namenode1 protobuf-2.5.0]$ sudo cp lib/libprotoc.so.8.0.0 /usr/lib64/
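
If you build Protocol Buffers 2.5.0 from source instead of using the pre-built files above, the standard autotools steps are enough; a sketch (the --prefix/--libdir values are assumptions chosen so that protoc and the libraries land in the paths listed above):

$ tar -zxf protobuf-2.5.0.tar.gz
$ cd protobuf-2.5.0
$ ./configure --prefix=/usr --libdir=/usr/lib64
$ make
$ sudo make install
$ sudo ldconfig          # refresh the shared library cache
$ protoc --version       # should print: libprotoc 2.5.0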

4. Building with JDK 8 produces the following error:

Caused by: org.apache.maven.reporting.MavenReportException:

Exit code: 1 - /usr/local/hadoop-2.5.0-src/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/InterfaceStability.java:27: error: unexpected end tag: </ul>

 * </ul>

   ^

 

Solution: uninstall JDK 8 and install JDK 7.
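
After switching, make sure both the java on PATH and JAVA_HOME point at JDK 7, since Maven picks up JAVA_HOME. A quick check (the jdk1.7.0_79 path matches the one that appears in the log in problem 5; adjust it to your install):

$ export JAVA_HOME=/usr/java/jdk1.7.0_79
$ export PATH=$JAVA_HOME/bin:$PATH
$ java -version      # should report 1.7.0_79
$ mvn -version       # the "Java home" line should point at the JDK 7 directory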

 

5.

[ERROR] around Ant part ...<exec

dir="/usr/local/hadoop-2.5.0-src/hadoop-common-project/hadoop-common/target/native" executable="make" failοnerrοr="true">... @ 7:130 in /usr/local/hadoop-2.5.0-src/hadoop-common-project/hadoop-common/

target/antrun/build-main.xml

[ERROR] -> [Help 1]

The actual cause appears earlier in the build output, so keep scrolling up from this error. On my machine the root cause was the following:

     [exec] In file included from /usr/include/features.h:352,
     [exec]                  from /usr/include/stdio.h:28,
     [exec]                  from /usr/java/jdk1.7.0_79/include/jni.h:39,
     [exec]                  from /usr/local/hadoop-2.5.0-src/hadoop-common-project/hadoop-common/src/main/native/src/exception.h:20,
     [exec]                  from /usr/local/hadoop-2.5.0-src/hadoop-common-project/hadoop-common/src/main/native/src/exception.c:19:
     [exec] /usr/include/gnu/stubs.h:7:27: error: gnu/stubs-32.h: No such file or directory
     [exec] make[2]: *** [CMakeFiles/hadoop.dir/main/native/src/exception.c.o] Error 1
     [exec] make[1]: *** [CMakeFiles/hadoop.dir/all] Error 2
     [exec] make: *** [all] Error 2
     [exec] make[2]: Leaving directory `/usr/local/hadoop-2.5.0-src/hadoop-common-project/hadoop-common/target/native'
     [exec] make[1]: Leaving directory `/usr/local/hadoop-2.5.0-src/hadoop-common-project/hadoop-common/target/native'

 

Solution:

# yum -y install glibc-devel.i386
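
Before re-running the build, you can confirm that the 32-bit development headers are now in place, for example:

$ rpm -q glibc-devel.i386 glibc-devel.x86_64
$ ls /usr/include/gnu/stubs-32.h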


 

6. java.io.IOException: Cannot run program "protoc": error=2, No such file or directory

   If Protocol Buffers 2.5.0 is already installed, this means protoc, libprotoc.so.8, and libprotoc.so.8.0.0 are in the wrong locations. They must be placed at the following paths:

/usr/bin/protoc
/usr/lib64/libprotoc.so.8
/usr/lib64/libprotoc.so.8.0.0
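
After copying the files, a quick check that the build will find protoc and its shared library (running ldconfig refreshes the loader cache after the manual copy):

$ sudo ldconfig
$ which protoc
/usr/bin/protoc
$ protoc --version
libprotoc 2.5.0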


7.

[ERROR] around Ant part ...<xslt style="${env.FINDBUGS_HOME}/src/xsl/default.xsl" in="/usr/local/hadoop-2.5.0-src/hadoop-common-project/hadoop-common/target/findbugsXml.xml" out="/usr/local/hadoop-2.5.0-src/hadoop-common-project/hadoop-common/target/site/findbugs.html"/>... @ 44:247 in /usr/local/hadoop-2.5.0-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml

[ERROR] -> [Help 1]

 

Solution: if Findbugs is installed correctly and the FINDBUGS_HOME environment variable is set (verify by running fb -version in a terminal, which should print the Findbugs version number), then run the build as the root user (I have not yet looked into why this is necessary):

[root@namenode1 hadoop-2.5.0-src]# mvn package -Pdist,native,docs -DskipTests -Dtar
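
One plausible explanation (an assumption, not verified) is that the root shell simply does not inherit FINDBUGS_HOME from the regular user's environment. A hedged workaround is to re-export the variables in the root shell before building; the Findbugs path below is the assumed install location from step 1:

$ su -
# export FINDBUGS_HOME=/usr/local/findbugs-2.0.3   # adjust to your install path
# export PATH=$FINDBUGS_HOME/bin:$PATH
# cd /usr/local/hadoop-2.5.0-src
# mvn package -Pdist,native,docs -DskipTests -Dtar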


 


The build eventually completed successfully.



The files generated by the build are as follows:

[yh.zeng@namenode1 target]$ pwd
/usr/local/hadoop-2.5.0-src/hadoop-dist/target
[yh.zeng@namenode1 target]$ ll -h
total 626M
drwxr-xr-x 2 root root 4.0K 08-27 20:27 antrun
-rw-r--r-- 1 root root 1.6K 08-27 20:27 dist-layout-stitching.sh
-rw-r--r-- 1 root root  640 08-27 20:29 dist-tar-stitching.sh
drwxr-xr-x 9 root root 4.0K 08-27 20:27 hadoop-2.5.0
-rw-r--r-- 1 root root 207M 08-27 20:30 hadoop-2.5.0.tar.gz
-rw-r--r-- 1 root root 2.7K 08-27 20:29 hadoop-dist-2.5.0.jar
-rw-r--r-- 1 root root 419M 08-27 20:34 hadoop-dist-2.5.0-javadoc.jar
drwxr-xr-x 2 root root 4.0K 08-27 20:31 javadoc-bundle-options
drwxr-xr-x 2 root root 4.0K 08-27 20:29 maven-archiver
drwxr-xr-x 2 root root 4.0K 08-27 20:27 test-dir
[yh.zeng@namenode1 target]$ du -sh *
8.0K    antrun
4.0K    dist-layout-stitching.sh
4.0K    dist-tar-stitching.sh
910M    hadoop-2.5.0
207M    hadoop-2.5.0.tar.gz
4.0K    hadoop-dist-2.5.0.jar
419M    hadoop-dist-2.5.0-javadoc.jar
8.0K    javadoc-bundle-options
8.0K    maven-archiver
4.0K    test-dir
[yh.zeng@namenode1 target]$ 


