Deployment environment:

OS: CentOS 6.4 64-bit

Hadoop version: hadoop-2.5.1

JDK version: jdk-7u65-linux-x64.gz (note: I originally tried JDK 1.8 and the build failed; it only compiled cleanly once I switched to JDK 1.7)
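For reference, a minimal sketch of installing that JDK, assuming the tarball is unpacked under /usr/local (jdk1.7.0_65 is the directory the archive normally extracts to; adjust if yours differs):

  # unpack the JDK tarball (install location is an assumption)
  tar -zxvf jdk-7u65-linux-x64.gz -C /usr/local/

  # point JAVA_HOME at it and put java on the PATH (e.g. in /etc/profile)
  export JAVA_HOME=/usr/local/jdk1.7.0_65
  export PATH=$PATH:$JAVA_HOME/bin

  java -version   # confirm the JDK that Maven will pick up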

1. Install the packages the Hadoop build depends on

 yum -y install lzo-devel zlib-devel gcc autoconf automake libtool gcc-c++ 

 yum -y install openssl-devel ncurses-devel
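Optionally, a quick sanity check that the compiler and development headers are actually in place before moving on:

  gcc --version                                 # C/C++ compiler used for the native bits
  rpm -q zlib-devel openssl-devel lzo-devel     # should each report an installed version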

2. Tools needed to build Hadoop (a download sketch follows this list)

Ant

Maven

ProtocolBuffer

FindBugs

CMake
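For reference, the Ant and Maven binary tarballs used below can still be fetched from the Apache archives (the URLs below follow the usual archive layout; verify them before relying on them). Protocol Buffers 2.5.0, CMake 2.8.12, and FindBugs 2.0.2 are available from their respective project download pages.

  wget https://archive.apache.org/dist/ant/binaries/apache-ant-1.9.4-bin.tar.gz
  wget https://archive.apache.org/dist/maven/maven-3/3.2.3/binaries/apache-maven-3.2.3-bin.tar.gz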

3. Build and install Protobuf

 tar xf protobuf-2.5.0.tar.gz 

 cd protobuf-2.5.0

 ./configure --prefix=/usr/local/protobuf

 make && make install
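Because of the --prefix above, protoc is installed under /usr/local/protobuf and is not on the PATH until step 8, so check it with the full path:

  /usr/local/protobuf/bin/protoc --version   # should print: libprotoc 2.5.0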

4. Build and install CMake

  tar xf cmake-2.8.12.tar.gz

  cd cmake-2.8.12

  ./bootstrap 

  make && make install
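./bootstrap defaults to a /usr/local prefix, so the new cmake lands on the standard PATH; verify with:

  cmake --version   # should print: cmake version 2.8.12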

5. Install Ant

tar -zxvf apache-ant-1.9.4-bin.tar.gz

Install as the root user:

mkdir -p /usr/local/ant
cp -r apache-ant-1.9.4 /usr/local/ant/

Create a "default" symlink, which keeps the environment setup below valid when you upgrade Ant later:

cd /usr/local/ant
ln -s /usr/local/ant/apache-ant-1.9.4 ant_default

Still as root, link the launcher script onto the PATH:

cd /usr/sbin
ln -s /usr/local/ant/ant_default/bin/ant ant

6. Install Maven

tar -zxvf apache-maven-3.2.3-bin.tar.gz

mkdir -p /usr/local/maven
cp -r apache-maven-3.2.3 /usr/local/maven/

cd /usr/local/maven
ln -s /usr/local/maven/apache-maven-3.2.3 maven_default

cd /usr/sbin
ln -s /usr/local/maven/maven_default/bin/mvn mvn

7. Install FindBugs

  tar xf findbugs-2.0.2.tar.gz -C /usr/local/
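The tarball unpacks straight into /usr/local/findbugs-2.0.2; the launcher is not on the PATH until step 8, so check it with the full path:

  /usr/local/findbugs-2.0.2/bin/findbugs -version   # should print: 2.0.2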

8. Configure environment variables

  protobuf  

  export PROTOBUF_HOME=/usr/local/protobuf  

  export CLASSPATH=.:$CLASSPATH:$PROTOBUF_HOME/lib 

  export PATH=$PATH:$PROTOBUF_HOME/bin

  findbugs 

  export FINDBUGS_HOME=/usr/local/findbugs-2.0.2

  export CLASSPATH=.:$CLASSPATH:$FINDBUGS_HOME/lib 

  export PATH=$PATH:$FINDBUGS_HOME/bin
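These exports only last for the current shell. One way to make them permanent is to add them, together with JAVA_HOME (which the Maven build also needs), to /etc/profile or ~/.bashrc and re-source it. A minimal sketch, assuming the JDK path from the environment section:

  # lines to append to /etc/profile (the JDK path is an assumption; match your install)
  export JAVA_HOME=/usr/local/jdk1.7.0_65
  export PATH=$PATH:$JAVA_HOME/bin
  export PROTOBUF_HOME=/usr/local/protobuf
  export PATH=$PATH:$PROTOBUF_HOME/bin
  export FINDBUGS_HOME=/usr/local/findbugs-2.0.2
  export PATH=$PATH:$FINDBUGS_HOME/bin

  # then reload the profile
  source /etc/profile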

9. Patch the hadoop-auth pom.xml

  cd hadoop-2.5.1-src

  vim hadoop-common-project/hadoop-auth/pom.xml

Add the following inside the <dependencies> section (this is the common workaround for the jetty-related "class file for org.mortbay.component.AbstractLifeCycle not found" compile error in hadoop-auth):

  <dependency>

    <groupId>org.mortbay.jetty</groupId>

    <artifactId>jetty-util</artifactId>

    <scope>test</scope>

  </dependency>

  <dependency>

    <groupId>org.mortbay.jetty</groupId>

    <artifactId>jetty</artifactId>

    <scope>test</scope>

  </dependency>

10. Check that the environment is set up correctly

  java -version

  ant -version

  mvn -version

  findbugs -version

  protoc --version

11. Start the build: go into the Hadoop source tree

  cd hadoop-2.5.1-src

  mvn package -Pdist -DskipTests -Dtar
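This builds the pure-Java distribution. If you also want the native libraries (libhadoop.so and friends, which are what cmake, protobuf, and the -devel packages above are needed for), the standard variant adds the native profile:

  mvn package -Pdist,native -DskipTests -Dtar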

Wait for the build to finish; it takes roughly 40 minutes (29:39 in the run below).

[INFO] Reactor Summary:

[INFO] 

[INFO] Apache Hadoop Main ................................. SUCCESS [  5.964 s]

[INFO] Apache Hadoop Project POM .......................... SUCCESS [  4.638 s]

[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 10.227 s]

[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.706 s]

[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  3.956 s]

[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  8.834 s]

[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  6.665 s]

[INFO] Apache Hadoop Auth ................................. SUCCESS [  7.737 s]

[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  5.999 s]

[INFO] Apache Hadoop Common ............................... SUCCESS [03:20 min]

[INFO] Apache Hadoop NFS .................................. SUCCESS [ 22.756 s]

[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.137 s]

.............

[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 15.821 s]

[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 33.312 s]

[INFO] Apache Hadoop Archives ............................. SUCCESS [  3.870 s]

[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 15.896 s]

[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 10.355 s]

[INFO] Apache Hadoop Data Join ............................ SUCCESS [  6.217 s]

[INFO] Apache Hadoop Extras ............................... SUCCESS [  6.437 s]

[INFO] Apache Hadoop Pipes ................................ SUCCESS [  0.098 s]

[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 13.306 s]

[INFO] Apache Hadoop Client ............................... SUCCESS [ 16.149 s]

[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.503 s]

[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 26.096 s]

[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 15.652 s]

[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.048 s]

[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:28 min]

[INFO] ------------------------------------------------------------------------

[INFO] BUILD SUCCESS

[INFO] ------------------------------------------------------------------------

[INFO] Total time: 29:39 min

[INFO] Finished at: 2014-10-19T12:06:47+08:00

[INFO] Final Memory: 92M/239M

[INFO] ------------------------------------------------------------------------

[root@hadoop hadoop-2.5.1-src]# 

The build succeeded.

Go to the build output inside the Hadoop source tree:

cd hadoop-2.5.1-src/hadoop-dist/target/

Running ll there shows a hadoop-2.5.1 directory.

hadoop-2.5.1.tar.gz is your freshly built Hadoop distribution.


ll hadoop-2.5.1

drwxr-xr-x 2 root root  4096 Oct 19 12:05 bin
drwxr-xr-x 3 root root  4096 Oct 19 12:05 etc
drwxr-xr-x 2 root root  4096 Oct 19 12:05 include
drwxr-xr-x 2 root root  4096 Oct 19 12:05 libexec
-rw-r--r-- 1 root root 15458 Oct 19 12:05 LICENSE.txt
-rw-r--r-- 1 root root   101 Oct 19 12:05 NOTICE.txt
-rw-r--r-- 1 root root  1366 Oct 19 12:05 README.txt
drwxr-xr-x 2 root root  4096 Oct 19 12:05 sbin
drwxr-xr-x 4 root root  4096 Oct 19 12:05 share
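From here, deployment is just a matter of unpacking that tarball wherever you want Hadoop to live, for example (the target directory is only an example):

  tar -zxvf hadoop-2.5.1.tar.gz -C /usr/local/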

Hehe, with all this preparation done, you can now go and set up your Hadoop environment with peace of mind!