Compiling Hadoop 2.2 from Source

The problems hit while compiling the Hadoop 2.2 source are much the same as those described in the post below:

http://blog.sina.com.cn/s/blog_93d695390101b5vf.html


The YARN source package ships with notes on building; see BUILDING.txt for details.
YARN was built here on Red Hat, with JDK 1.6.0_23.


1. Download the YARN source package
Download address: http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.0.2-alpha/
(the version used here is 2.0.2-alpha); download hadoop-2.0.2-alpha-src.tar.gz.
After downloading, extract it to the /home/wyf directory.
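A quick sketch of the download and extraction from the shell (same URL and paths as above; assumes wget is available):
wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.0.2-alpha/hadoop-2.0.2-alpha-src.tar.gz
tar zxvf hadoop-2.0.2-alpha-src.tar.gz -C /home/wyf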


2. Install Maven
YARN uses Maven as its build tool. I had never used it before, so this was a matter of gritting my teeth and giving it a try. First download the Maven package from: http://labs.mop.com/apache-mirror/maven/maven-3/3.0.4/binaries/apache-maven-3.0.4-bin.tar.gz
After downloading, extract it and set the MAVEN_HOME and PATH variables. Once that is done, run mvn -version to check that the installation succeeded.
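A minimal sketch of the environment setup, assuming Maven was extracted to /home/wyf/apache-maven-3.0.4 (append to /etc/profile, then source it):
export MAVEN_HOME=/home/wyf/apache-maven-3.0.4
export PATH=$PATH:$MAVEN_HOME/bin
After running source /etc/profile, mvn -version should report Apache Maven 3.0.4.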


3. Install ProtoBuf
Why have I never used any of the tools the YARN build depends on... clearly I still have a lot to learn.
Same routine: download and extract. Download address: http://protobuf.googlecode.com/files/protobuf-2.4.1.tar.bz2
Extract: tar jxvf protobuf-2.4.1.tar.bz2
Then run ./configure, make, and sudo make install in turn to install ProtoBuf.
When running ./configure you can specify an installation prefix (the default install goes to /usr/local). For example: ./configure --prefix=/home/wyf/protobuf installs protobuf under /home/wyf/protobuf, where you should then see three directories: include, bin, and lib. (Several earlier attempts left me with only the lib directory and not the other two; it seems my extraction method was wrong, and the install succeeded once I extracted as shown above.)
Next, edit /etc/profile and add:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/wyf/protobuf/lib
export PATH=$PATH:/home/wyf/protobuf/bin
Finally, run protoc --version to check that ProtoBuf installed successfully.
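Pulled together, the ProtoBuf installation is roughly (a sketch, assuming the /home/wyf paths used above):
tar jxvf protobuf-2.4.1.tar.bz2
cd protobuf-2.4.1
./configure --prefix=/home/wyf/protobuf
make
sudo make install
source /etc/profile    # pick up the new PATH and LD_LIBRARY_PATH
protoc --version       # should print: libprotoc 2.4.1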


4. Install the autotools
Yet another set of tools I had never used...
Install via the package manager: sudo yum install autoconf automake libtool
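If in doubt, you can confirm afterwards that the tools are present:
autoconf --version
automake --version
libtool --version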


5. Compile
Make sure you have a network connection for the first build, because a large number of supporting packages are downloaded from the network.
If the extracted YARN source is at /home/wyf/hadoop-2.0.2-alpha-src, cd into that directory to start the build.
Build command: mvn package -Pdist,native -DskipTests
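Here the dist and native profiles build the complete distribution including the native libraries, and -DskipTests skips the unit tests to save time. If memory serves, BUILDING.txt also lists a variant that additionally packs the result into a tarball:
mvn package -Pdist,native -DskipTests -Dtar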
Then comes a long wait... until every project shows SUCCESS, which means the build completed successfully!
The output a successful build should show:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................ SUCCESS [6.256s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [5.964s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [4.507s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.358s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [4.044s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [4.816s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [4.517s]
[INFO] Apache Hadoop Common .............................. SUCCESS [3:37.475s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [3.391s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [3:25.417s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [45.369s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [19.111s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.225s]
[INFO] hadoop-yarn ....................................... SUCCESS [1.273s]
[INFO] hadoop-yarn-api ................................... SUCCESS [1:49.559s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:14.722s]
[INFO] hadoop-yarn-server ................................ SUCCESS [3.524s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [20.320s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [22.533s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [4.906s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [28.781s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [2.583s]
[INFO] hadoop-yarn-client ................................ SUCCESS [4.943s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.149s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [3.314s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.364s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [47.326s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [11.138s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.267s]
[INFO] hadoop-yarn-project ............................... SUCCESS [11.935s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [47.074s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [11.811s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [14.339s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [7.141s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [10.101s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [7.732s]
[INFO] hadoop-mapreduce .................................. SUCCESS [8.388s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [5.928s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [15.718s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [10.997s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [9.091s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [5.791s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [3.456s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [4.658s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [3.120s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [2.242s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.085s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [42.718s]
[INFO] Apache Hadoop Client .............................. SUCCESS [17.661s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.736s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 18:21.523s
[INFO] Finished at: Thu Jan 31 15:28:01 CST 2013
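A successful build places the assembled distribution under hadoop-dist/target in the source tree (this is the usual 2.x layout), for example:
ls /home/wyf/hadoop-2.0.2-alpha-src/hadoop-dist/target/hadoop-2.0.2-alpha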


6. Troubleshooting
Roughly two kinds of errors came up during the build: first, required packages failed to download because of network problems, which halted the build; second, CMake was not installed.
Error 1: packages failed to download
During the build you will see the YARN build automatically download many required packages, but a download sometimes fails. The error looks roughly like this:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-dependency-plugin:2.1:build-classpath (build-classpath) on project hadoop-project: Execution build-classpath of goal org.apache.maven.plugins:maven-dependency-plugin:2.1:build-classpath failed: Plugin org.apache.maven.plugins:maven-dependency-plugin:2.1 or one of its dependencies could not be resolved: Could not transfer artifact org.apache.maven.doxia:doxia-core:jar:1.0-alpha-7 from/to central (http://repo.maven.apache.org/maven2): GET request of: org/apache/maven/doxia/doxia-core/1.0-alpha-7/doxia-core-1.0-alpha-7.jar from central failed: Connection reset -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn -rf :hadoop-project
Several ResolutionExceptions (and DependencyResolutionExceptions) came up during my builds. When this happens there is nothing to fix: just run the build command again, and Maven picks up from the failed downloads and continues the build.
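As the last lines of the error output suggest, the build can also be resumed from the failed module with Maven's -rf (resume from) flag, e.g.:
mvn package -Pdist,native -DskipTests -rf :hadoop-project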
Error 2: CMake not installed
The error:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/home/wyf/hadoop-2.0.2-alpha-src/hadoop-common-project/hadoop-common/target/native"): java.io.IOException: error=2, No such file or directory -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
The fix is to install CMake:
sudo yum install cmake
Once it reports Complete!, resume the build and it goes through.


7. Verification
The simplest way to verify the build is to run the Hadoop system with the jar files you just compiled. The generated jars generally live in target directories; because YARN is split into many fine-grained modules, they are scattered across the target directories of many subdirectories, so hunting them down takes some patience (see the find sketch below). Replace the corresponding jars under hadoop-2.0.2-alpha/share/hadoop with them, configure Hadoop (omitted here), and start Hadoop on the master.
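A quick way to locate all the generated jars from the source root (a sketch; adjust the path to your tree):
cd /home/wyf/hadoop-2.0.2-alpha-src
find . -name '*.jar' -path '*/target/*'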
