Compiling hadoop-2.4.0 from Source

The system is Ubuntu 14.04, 32-bit. I had always used the official binaries (the official builds are 32-bit), but this time I tried compiling from source myself. The process, roughly, is as follows:

1. Download the hadoop-2.4.0-src.tar.gz source package

Official download: http://hadoop.apache.org/

Baidu Pan mirror: http://pan.baidu.com/s/1pJ7gdMN

After downloading, extract the archive to get the Hadoop source directory: hadoop-2.4.0-src

2. Install the software required for the build:

1). Install JDK 1.7 (do not use JDK 1.8 for this build):

Official download: http://www.oracle.com/technetwork/java/javase/downloads/index.html

Baidu Pan mirror: http://pan.baidu.com/s/1gd9yrb9

After downloading, extract it to /usr/local/ and configure the environment variables:

export JAVA_HOME=/usr/local/jdk1.7.0_79
export JRE_HOME=/usr/local/jdk1.7.0_79/jre
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH

Then exit the editor and `source` the profile file.

Verify the installation:

java -version
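The three export lines above can be made permanent by appending them to a shell profile. A minimal sketch (the JDK path matches the unpack location above; `PROFILE` is an illustrative variable, overridable for testing, and defaults to `~/.bashrc`):

```shell
# Sketch: persist the JDK environment variables by appending them to ~/.bashrc.
# The JDK path assumes the archive was extracted to /usr/local/jdk1.7.0_79.
PROFILE="${PROFILE:-$HOME/.bashrc}"
cat >> "$PROFILE" <<'EOF'
export JAVA_HOME=/usr/local/jdk1.7.0_79
export JRE_HOME=$JAVA_HOME/jre
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
EOF
```

After this, `source ~/.bashrc` (or open a new shell) and `java -version` should report 1.7.0_79.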

2). Install Maven

Official download: http://maven.apache.org/download.cgi

Baidu Pan mirror: http://pan.baidu.com/s/1ntIF1S9

Extract the archive to get apache-maven-3.2.5, install it under /usr, and configure the environment variables:

export MAVEN_HOME=/usr/apache-maven-3.2.5
export MAVEN_OPTS="-Xms128m -Xmx512m"

export PATH=$MAVEN_HOME/bin:$PATH

Then exit the editor and `source` the profile file.

Verify the installation:

mvn -v
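A note on the `MAVEN_OPTS` line above: it passes JVM options to Maven itself. The 512 MB heap was enough for my build, but if `mvn package` ever fails with `java.lang.OutOfMemoryError`, raising the heap is the usual first fix. A sketch (the sizes are suggestions, not requirements):

```shell
# MAVEN_OPTS controls the JVM that runs Maven; enlarge the heap if the
# Hadoop build runs out of memory. These values are illustrative.
export MAVEN_OPTS="-Xms256m -Xmx1024m"
```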

3). Install Ant

Official download: http://ant.apache.org/bindownload.cgi

Baidu Pan mirror: http://pan.baidu.com/s/1bnASwJd

Extract the archive to get apache-ant-1.9.4, install it under /usr, and configure the environment variables:

export ANT_HOME=/usr/apache-ant-1.9.4
export PATH=$ANT_HOME/bin:$PATH

Then exit the editor and `source` the profile file.

Verify the installation:

ant -version

4). Install g++

  sudo apt-get install g++

5). Install protobuf (hadoop-2.4.0 requires protobuf 2.5.0):

Download: http://pan.baidu.com/s/1jGotvQA

tar xzf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
sudo ./configure --prefix=/usr/protobuf    # --prefix sets the install directory; omit it to use the default
sudo make
sudo make install
sudo ldconfig

Configure the environment variables:

export PROTOC_HOME=/usr/protobuf
export PATH=$PROTOC_HOME/bin:$PATH
export LD_LIBRARY_PATH=$PROTOC_HOME/lib

Then exit the editor and `source` the profile file.

Verify the installation:

hadoop@master:~$ protoc --version
libprotoc 2.5.0
hadoop@master:~$ 
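Because the hadoop-2.4.0 build fails unless protoc reports exactly 2.5.0, it can be worth checking before starting Maven. A small sketch; the `check_protoc` helper is hypothetical, not part of any tool:

```shell
# Hypothetical helper: confirm that the protoc on PATH is exactly the
# version the hadoop-2.4.0 Maven build expects (libprotoc 2.5.0).
check_protoc() {
  case "$1" in
    "libprotoc 2.5.0") echo ok ;;
    *) echo "need protobuf 2.5.0, found: $1" ;;
  esac
}
# Prints "ok" only when the installed protoc matches.
check_protoc "$(protoc --version 2>/dev/null)"
```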

6). Install cmake

Download: http://pan.baidu.com/s/1sj7cadf

tar xzf cmake-2.8.12.2.tar.gz
cd cmake-2.8.12.2
sudo ./bootstrap --prefix=/usr/cmake    # --prefix sets the install directory
sudo make
sudo make install

Configure the environment variables:

export CMAKE_HOME=/usr/cmake
export PATH=$CMAKE_HOME/bin:$PATH

7). Install the OpenSSL development library

sudo apt-get install libssl-dev

8). Install libglib2.0-dev

sudo apt-get install libglib2.0-dev

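For convenience, the apt-installed prerequisites from the steps above can be pulled in with a single command:

```shell
# One-shot install of the apt prerequisites used by the build.
sudo apt-get install g++ libssl-dev libglib2.0-dev
```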

3. Build hadoop-2.4.0

cd ~/hadoop-2.4.0-src

mvn package -Pdist -DskipTests -Dtar  

Note: make sure hadoop-2.4.0-src is owned by the build user:

sudo chown -R hadoop:hadoop hadoop-2.4.0-src
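The command above uses the `dist` profile only, which skips the native (C/C++) libraries. If you also want libhadoop.so built, the usual variant is to add the `native` profile (this is what cmake, protobuf, and libssl-dev are needed for). A sketch, assuming all the prerequisites above are installed:

```shell
cd ~/hadoop-2.4.0-src
# distribution only (what this article uses):
mvn package -Pdist -DskipTests -Dtar
# distribution plus native libraries:
mvn package -Pdist,native -DskipTests -Dtar
```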

When the build completes:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................. SUCCESS [  4.940 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  2.429 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  3.136 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.521 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  2.996 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  4.248 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  2.875 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  2.447 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  2.249 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:41 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  5.997 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.085 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [04:11 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 19.371 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 31.596 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  4.239 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.054 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.073 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [01:43 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 52.105 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.114 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [  5.712 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 56.731 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  1.851 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 14.598 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 10.669 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  2.724 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [  3.387 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.290 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  2.537 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  1.601 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.116 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [  9.501 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.075 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 20.997 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 15.518 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  1.601 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [  5.851 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  4.736 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 16.372 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  1.167 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  3.231 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [  5.312 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  8.653 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 23.317 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  1.464 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  3.259 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  2.664 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  1.524 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  1.646 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  0.035 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  3.909 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [  8.041 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.147 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 13.277 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [  6.222 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.024 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 39.922 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:00 min
[INFO] Finished at: 2015-04-19T10:52:55+08:00
[INFO] Final Memory: 126M/391M
[INFO] ------------------------------------------------------------------------
hadoop@master:~/hadoop-2.4.0-src$ 

The file hadoop-2.4.0.tar.gz under ~/hadoop-2.4.0-src/hadoop-dist/target is the distribution we were after.

Of course, the build hit quite a few problems along the way; for lack of time I only recorded a few of them. See my other post: problems compiling the hadoop-2.4.0 source.

