Building Hadoop 2.6 from source (64-bit)

1. Install the JDK. Use java -version to check the version and confirm it is a 64-bit JDK. (It must be JDK 7, update 4 or later.)
(1) Extract the JDK
$ tar -xvzf  jdk-7u72-linux-x64.tar.gz

(2) Set environment variables: vim /etc/profile
#============= java env ============
export JAVA_HOME=/usr/local/jdk1.7.0_72
export CLASSPATH=$CLASSPATH:$JAVA_HOME/lib:$JAVA_HOME/jre/lib
export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH

(3) Reload the profile
$ source /etc/profile

(4) Verify
[root@localhost local]# java -version
java version "1.7.0_72"
Java(TM) SE Runtime Environment (build 1.7.0_72-b14)
Java HotSpot(TM) 64-Bit Server VM (build 24.72-b04, mixed mode)
[root@localhost local]#
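The 64-bit check in step 1 can be scripted: HotSpot's java -version output contains the string "64-Bit" for a 64-bit VM, as in the transcript above. A small sketch (the function name is mine, not part of any tool):

```shell
# is_64bit_jdk: succeed if the given `java -version` output describes
# a 64-bit VM (HotSpot prints "64-Bit Server VM" in that case).
is_64bit_jdk() {
  printf '%s\n' "$1" | grep -q '64-Bit'
}

# Usage: java -version writes to stderr, so capture both streams.
if command -v java >/dev/null 2>&1; then
  is_64bit_jdk "$(java -version 2>&1)" && echo "64-bit JDK detected" \
    || echo "warning: this does not look like a 64-bit JDK"
fi
```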
 
2. Install Apache Maven.
(1) Extract Maven
$ tar -xvzf apache-maven-3.2.5-bin.tar.gz
$ mv apache-maven-3.2.5 maven

(2) Set environment variables: vim /etc/profile
#============= maven env ===========
export MAVEN_HOME=/usr/local/maven
export PATH=$PATH:$MAVEN_HOME/bin

(3) Reload the profile
$ source /etc/profile

(4) Verify
[root@localhost local]# mvn -version
Apache Maven 3.2.5 (12a6b3acb947671f09b81f49094c53f426d8cea1; 2014-12-14T09:29:23-08:00)
Maven home: /usr/local/maven
Java version: 1.7.0_72, vendor: Oracle Corporation
Java home: /usr/local/jdk1.7.0_72/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"
[root@localhost local]#

3. Install Apache Ant.
(1) Extract Ant
$ tar -xvzf apache-ant-1.9.4-bin.tar.gz
$ mv apache-ant-1.9.4 ant

(2) Set environment variables: vim /etc/profile
#============= ant env ===========
export ANT_HOME=/usr/local/ant
export PATH=$PATH:$ANT_HOME/bin

(3) Reload the profile
$ source /etc/profile

(4) Verify
[root@localhost local]# ant -version
Apache Ant(TM) version 1.9.4 compiled on April 29 2014
[root@localhost local]# 



5. Install FindBugs
(1) Extract FindBugs
$ tar -xvzf findbugs-2.0.3.tar.gz
$ mv findbugs-2.0.3 findbugs

(2) Set environment variables: vim /etc/profile
#=============findbugs env ===========
export FINDBUGS_HOME=/usr/local/findbugs
export PATH=$PATH:$FINDBUGS_HOME/bin

(3) Reload the profile
$ source /etc/profile

(4) Verify
[root@localhost local]# findbugs -version
2.0.3
[root@localhost local]# 
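Steps 2, 3 and 5 all repeat the same pattern: extract a versioned tarball, rename the directory to a stable name, and put its bin directory on PATH. As a sketch, the repeated part can be captured in one helper (the function and its arguments are illustrative, not from the original steps):

```shell
# install_tool TARBALL SRCDIR NAME [DEST]
# Extract TARBALL under DEST (default /usr/local) and rename the
# versioned directory SRCDIR to the stable name NAME, as done above
# for maven, ant and findbugs.
install_tool() {
  tarball=$1; srcdir=$2; name=$3; dest=${4:-/usr/local}
  mkdir -p "$dest"
  tar -xzf "$tarball" -C "$dest"
  mv "$dest/$srcdir" "$dest/$name"
}

# e.g. install_tool apache-maven-3.2.5-bin.tar.gz apache-maven-3.2.5 maven
```

The corresponding export lines in /etc/profile still need to be added by hand, as shown in each step.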

6. Install protobuf (Google's serialization tool)
(1) Extract protobuf
$ tar -xvzf protobuf-2.5.0.tar.gz
(2) Install the build tools
$ yum install gcc
$ yum install gcc-c++
$ yum install cmake
$ yum install openssl-devel
$ yum install ncurses-devel
(3) Verify the tools
Verify gcc
[root@localhost local]# gcc -v
Using built-in specs.
Target: x86_64-redhat-linux
Configured with: ../configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --with-bugurl=http://bugzilla.redhat.com/bugzilla --enable-bootstrap --enable-shared --enable-threads=posix --enable-checking=release --with-system-zlib --enable-__cxa_atexit --disable-libunwind-exceptions --enable-gnu-unique-object --enable-languages=c,c++,objc,obj-c++,java,fortran,ada --enable-java-awt=gtk --disable-dssi --with-java-home=/usr/lib/jvm/java-1.5.0-gcj-1.5.0.0/jre --enable-libgcj-multifile --enable-java-maintainer-mode --with-ecj-jar=/usr/share/java/eclipse-ecj.jar --disable-libjava-multilib --with-ppl --with-cloog --with-tune=generic --with-arch_32=i686 --build=x86_64-redhat-linux
Thread model: posix
gcc version 4.4.7 20120313 (Red Hat 4.4.7-11) (GCC)
[root@localhost local]#

Verify make
[root@localhost local]# make -version
GNU Make 3.81
Copyright (C) 2006  Free Software Foundation, Inc.
This is free software; see the source for copying conditions.
There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A
PARTICULAR PURPOSE.

This program built for x86_64-redhat-linux-gnu
[root@localhost local]#

Verify cmake
[root@localhost local]# cmake -version
cmake version 2.8.12.2
[root@localhost local]#
(4) Build and install
[root@localhost protobuf-2.5.0]# pwd
/usr/local/protobuf-2.5.0
[root@localhost protobuf-2.5.0]# ./configure --prefix=/usr/local/protoc
[root@localhost protobuf-2.5.0]# make && make install


(5) Set environment variables: vim /etc/profile
#============= protobuf env ===========
export PROTOC_HOME=/usr/local/protoc
export PATH=.:$PROTOC_HOME/bin:$PATH

(6) Reload the profile
$ source /etc/profile
(7) Verify
Note: there are two dashes before "version".
[root@localhost protobuf-2.5.0]# protoc --version
libprotoc 2.5.0
[root@localhost protobuf-2.5.0]#
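Beyond protoc --version, a quick end-to-end check is to compile a one-message .proto file. The file below is a throwaway example of mine (proto2 syntax, as used by protobuf 2.5.0); the protoc invocation is standard:

```shell
# Write a minimal proto2 message definition.
cat > /tmp/ping.proto <<'EOF'
message Ping {
  required int32 seq = 1;
}
EOF

# Compile it to Java sources if protoc is on PATH; generated code
# lands in the directory given by --java_out.
if command -v protoc >/dev/null 2>&1; then
  mkdir -p /tmp/ping-gen
  protoc --proto_path=/tmp --java_out=/tmp/ping-gen /tmp/ping.proto
  ls /tmp/ping-gen
fi
```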

7. Set Maven's memory limits
To avoid java.lang.OutOfMemoryError: Java heap space during the build, run the following on the CentOS system before compiling:
$ export MAVEN_OPTS="-Xms256m -Xmx512m"
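If you prefer not to re-export this in every shell, the same line can be appended to /etc/profile alongside the earlier variables. Either way, a quick check that the setting is in effect (the heap sizes are the ones chosen above):

```shell
# Maven reads MAVEN_OPTS from the environment at startup, so these JVM
# heap flags only affect mvn runs in shells where the variable is set.
export MAVEN_OPTS="-Xms256m -Xmx512m"
echo "MAVEN_OPTS=$MAVEN_OPTS"
```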


8. Extract and build Hadoop
(1) Extract hadoop-2.6.0
$ tar -zxvf hadoop-2.6.0-src.tar.gz

(2) cd into the ${hostname_Local}/hadoop-2.6.0/ directory
(3) Build
$ mvn package -DskipTests -Pdist,native
(4) Build output
The finished build is placed under the hadoop-2.6.0-src/hadoop-dist/target directory.
[root@localhost target]# ls
antrun                    hadoop-2.6.0           hadoop-dist-2.6.0-javadoc.jar  maven-archiver
dist-layout-stitching.sh  hadoop-dist-2.6.0.jar  javadoc-bundle-options         test-dir
[root@localhost target]# pwd
/usr/local/hadoop/hadoop-2.6.0-src/hadoop-dist/target
The hadoop-2.6.0 directory there is the final, successfully built distribution.


[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [  2.955 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  1.990 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  3.062 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.349 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  2.043 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  4.806 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  3.387 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  4.495 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  4.192 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:54 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  7.996 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 12.124 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.047 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [03:30 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 22.279 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [  7.885 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  5.057 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.045 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.032 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [01:29 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 35.781 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.069 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 14.083 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 20.344 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  3.344 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [  7.385 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 22.124 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  6.307 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [  8.961 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.094 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  2.914 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  1.742 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.089 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [  6.321 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [  3.102 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.053 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 24.774 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 18.711 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  4.501 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 12.472 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  9.420 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [  6.523 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  1.746 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  6.753 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [  3.503 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  5.307 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 10.503 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  2.527 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  6.462 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  4.619 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  2.883 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  2.287 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  3.916 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  2.126 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  7.429 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [05:47 min]
[INFO] Apache Hadoop Client ............................... SUCCESS [  3.927 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.088 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  6.262 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 13.865 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.018 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 29.942 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:32 min
[INFO] Finished at: 2015-02-08T07:09:15-08:00
[INFO] Final Memory: 89M/403M
[INFO] ------------------------------------------------------------------------
[root@localhost hadoop-2.6.0-src]#  
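Since the point of this build is 64-bit native libraries, it is worth inspecting the produced libhadoop.so with file(1). A sketch (the helper name is mine; the path is the target directory from step 8):

```shell
# check_native_arch: succeed if `file` output describes a 64-bit ELF
# object, which is what -Pdist,native should produce on x86_64.
check_native_arch() {
  printf '%s\n' "$1" | grep -q 'ELF 64-bit'
}

# On the build machine:
lib=/usr/local/hadoop/hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/lib/native/libhadoop.so
if [ -e "$lib" ]; then
  check_native_arch "$(file "$lib")" && echo "native libraries are 64-bit" \
    || echo "warning: native libraries are not 64-bit"
fi
```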

9. Problems encountered during the build and their solutions
(1) Problem: JDK 6 was installed at first; during the build it turned out that JDK 7 (update 4 or later) is required.
 Apache Hadoop OpenStack support .................... SUCCESS [  7.304 s]
[INFO] Apache Hadoop Amazon Web Services support .......... FAILURE [05:59 min]
[INFO] Apache Hadoop Client ............................... SKIPPED
[INFO] Apache Hadoop Mini-Cluster ......................... SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............. SKIPPED
[INFO] Apache Hadoop Tools Dist ........................... SKIPPED
[INFO] Apache Hadoop Tools ................................ SKIPPED
[INFO] Apache Hadoop Distribution ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17:52 min
[INFO] Finished at: 2015-02-08T05:51:32-08:00
[INFO] Final Memory: 88M/351M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project hadoop-aws: Could not resolve dependencies for project
org.apache.hadoop:hadoop-aws:jar:2.6.0: Failed to collect dependencies at
com.amazonaws: aws-java-sdk:jar:1.7.4 -> joda-time:joda-time:jar:2.5:
Failed to read artifact descriptor for joda-time:joda-time:jar:2.5:
Could not transfer artifact joda-time:joda-time:pom:2.5 from/to central
(https://repo.maven.apache.org/maven2): Connection reset -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-aws
[root@localhost hadoop-2.6.0-src]# 

Solution: Installing jdk-7u72-linux-x64.tar.gz resolved the problem, so Hadoop 2.6.0 must be built with JDK 7 update 4 or later.
