Compiling Hadoop 2.6.0 from Source

I. Tools required before compiling

      HADOOP:   hadoop-2.6.0-src.tar.gz
      JDK:      jdk-7u71-linux-x64.tar.gz
      MAVEN:    apache-maven-3.0.5-bin.tar.gz
      PROTOBUF: protobuf-2.5.0.tar.gz
      FINDBUGS: findbugs-3.0.0.tar.gz
      ANT:      apache-ant-1.9.4-bin.tar.gz

II. Download the required components

1. Download the Hadoop source code (2.6.0 is the latest stable release at the time of writing)

URL: http://mirrors.hust.edu.cn/apache/hadoop/common/stable/hadoop-2.6.0-src.tar.gz

2. Download apache-ant (the ant version shipped with CentOS is too old and causes errors during the build)

URL: http://mirrors.cnnic.cn/apache//ant/binaries/apache-ant-1.9.4-bin.zip (any recent version works)

3. Download protobuf-2.5.0.tar.gz (Google's data serialization format)

URL: https://developers.google.com/protocol-buffers/docs/downloads (the official site is blocked in mainland China; domestic mirror links can be found via a search engine)
Note: Hadoop 2.6.0 must be built against protobuf 2.5.0; any other version will cause the build to fail.

4. Download findbugs

URL: http://prdownloads.sourceforge.net/findbugs/findbugs-3.0.1.tar.gz?download (any recent version works)

5. Download maven

URL: http://maven.apache.org/download.cgi (any recent version works; this article uses 3.0.5)

6. Download the JDK

URL: easy to find on the Oracle website; JDK 1.6 or later is required (this article uses 1.7)

III. Extract hadoop, JDK, MAVEN, PROTOBUF, FINDBUGS, and ANT, using /app as the root directory

      tar -zxvf hadoop-2.6.0-src.tar.gz
      tar -zxvf jdk-7u71-linux-x64.tar.gz
      tar -zxvf apache-maven-3.0.5-bin.tar.gz
      tar -zxvf protobuf-2.5.0.tar.gz
      tar -zxvf findbugs-3.0.0.tar.gz
      tar -zxvf apache-ant-1.9.4-bin.tar.gz

IV. Rename the directories

      mv jdk1.7.0_71 jdk7     # the JDK tarball extracts to jdk1.7.0_71
      mv protobuf-2.5.0 protobuf
      mv findbugs-3.0.0 findbugs
      mv apache-ant-1.9.4 ant
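
After renaming, /app should contain roughly the following directories (maven keeps its original name, since MAVEN_HOME below points to it directly); a sketch of the expected layout:

      ls /app
      ant  apache-maven-3.0.5  findbugs  hadoop-2.6.0-src  jdk7  protobuf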

V. Configure the environment variables

      export JAVA_HOME=/app/jdk7
      export MAVEN_HOME=/app/apache-maven-3.0.5
      export ANT_HOME=/app/ant
      export FINDBUGS_HOME=/app/findbugs
      export CLASSPATH=.:$JAVA_HOME/lib:$JAVA_HOME/jre/lib
      export PATH=.:$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$MAVEN_HOME/bin:$ANT_HOME/bin:$FINDBUGS_HOME/bin:$PATH:$CLASSPATH
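
These exports only last for the current shell session. To make them permanent, they are typically appended to /etc/profile (or ~/.bashrc) and reloaded; a minimal check, assuming the exports were added to /etc/profile:

      source /etc/profile
      echo $JAVA_HOME     # should print /app/jdk7
      which mvn           # should print /app/apache-maven-3.0.5/bin/mvn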

VI. Edit the Maven configuration file

      vi /app/apache-maven-3.0.5/conf/settings.xml

First set a local repository directory for downloaded jars:

      <localRepository>/home/cargo/m2</localRepository>

The build rarely goes smoothly on the first attempt; if jar downloads fail because of network problems, the local repository cache has to be cleared by hand before the next run, so pick an easy-to-remember directory that is convenient to clean out.

Next, switch the Maven repository mirror by adding the following inside <mirrors></mirrors>:
      <mirror>
            <id>nexus-osc</id>
            <mirrorOf>*</mirrorOf>
            <name>Nexus osc</name>
            <url>http://maven.oschina.net/content/groups/public/</url>
      </mirror>
Then add a new profile inside <profiles></profiles>:
      <profile>
            <id>jdk-1.7</id>
            <activation>
                  <jdk>1.7</jdk>
            </activation>
            <repositories>
                  <repository>
                        <id>nexus</id>
                        <name>local private nexus</name>
                        <url>http://maven.oschina.net/content/groups/public/</url>
                        <releases>
                              <enabled>true</enabled>
                        </releases>
                        <snapshots>
                              <enabled>false</enabled>
                        </snapshots>
                  </repository>
            </repositories>
            <pluginRepositories>
                  <pluginRepository>
                        <id>nexus</id>
                        <name>local private nexus</name>
                        <url>http://maven.oschina.net/content/groups/public/</url>
                        <releases>
                              <enabled>true</enabled>
                        </releases>
                        <snapshots>
                              <enabled>false</enabled>
                        </snapshots>
                  </pluginRepository>
            </pluginRepositories>
      </profile>
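
To confirm that Maven actually picks up the mirror and the profile, you can dump the merged settings with the standard Maven Help Plugin goal:

      mvn help:effective-settings

The output should contain the nexus-osc mirror and, when running under JDK 1.7, the jdk-1.7 profile.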

VII. Verify that each tool is installed correctly

  java -version

            java version "1.7.0_71"
            Java(TM) SE Runtime Environment (build 1.7.0_71-b14)
            Java HotSpot(TM) 64-Bit Server VM (build 24.71-b01, mixed mode)

  mvn -version

            Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; 2013-02-19 21:51:28+0800)
            Maven home: /app/apache-maven-3.0.5
            Java version: 1.7.0_71, vendor: Oracle Corporation
            Java home: /app/jdk7/jre
            Default locale: zh_CN, platform encoding: UTF-8
            OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"

  ant -version

            Apache Ant(TM) version 1.9.4 compiled on April 29 2014

  findbugs -version

            3.0.0

If you see output like the above, everything is installed correctly.

VIII. Install all of protobuf's build dependencies

    yum install gcc
    yum install gcc-c++
    yum install make
    yum install cmake
    yum install openssl-devel
    yum install ncurses-devel
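
Equivalently, all of the dependencies can be installed with a single command; -y skips the confirmation prompts:

    yum install -y gcc gcc-c++ make cmake openssl-devel ncurses-devel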

IX. Build and install protobuf-2.5.0

Change into the protobuf directory and run the following commands:

      cd /app/protobuf
      ./configure
      make
      make check
      make install
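
To verify the installation, check the protoc version; it must report exactly 2.5.0, matching the requirement noted above:

      protoc --version
      libprotoc 2.5.0

If protoc aborts with a missing libprotoc shared-library error (common on CentOS, where /usr/local/lib is not searched by default), adding /usr/local/lib to /etc/ld.so.conf and running ldconfig usually fixes it.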

X. Compile Hadoop 2.6.0

      cd hadoop-2.6.0-src
      mvn package -DskipTests -Pdist,native -Dtar

Here -DskipTests skips the unit tests, the dist and native profiles build the binary distribution along with the native libraries, and -Dtar packages the result as a tarball.

After quite a long wait (the full build took just over half an hour on this machine, per the summary below), output like the following appears:
[INFO] Reactor Summary:
[INFO]

[INFO] Apache Hadoop Main................................. SUCCESS [  4.401 s]
[INFO] Apache Hadoop Project POM.......................... SUCCESS [  3.864 s]
[INFO] Apache Hadoop Annotations.......................... SUCCESS [  7.591 s]
[INFO] Apache Hadoop Assemblies........................... SUCCESS [  0.535 s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [  3.585 s]
[INFO] Apache Hadoop Maven Plugins........................ SUCCESS [  6.623 s]
[INFO] Apache Hadoop MiniKDC.............................. SUCCESS [  4.722 s]
[INFO] Apache Hadoop Auth................................. SUCCESS [  7.787 s]
[INFO] Apache Hadoop Auth Examples........................ SUCCESS [  5.500 s]
[INFO] Apache Hadoop Common............................... SUCCESS [02:47 min]
[INFO] Apache Hadoop NFS.................................. SUCCESS [ 12.793 s]
[INFO] Apache Hadoop KMS.................................. SUCCESS [ 20.443 s]
[INFO] Apache Hadoop Common Project....................... SUCCESS [  0.111 s]
[INFO] Apache Hadoop HDFS................................. SUCCESS [04:35 min]
[INFO] Apache Hadoop HttpFS............................... SUCCESS [ 29.896 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal.............. SUCCESS [ 11.100 s]
[INFO] Apache Hadoop HDFS-NFS............................. SUCCESS [  8.262 s]
[INFO] Apache Hadoop HDFS Project......................... SUCCESS [  0.069 s]
[INFO] hadoop-yarn........................................ SUCCESS [  0.066 s]
[INFO] hadoop-yarn-api.................................... SUCCESS [02:05 min]
[INFO] hadoop-yarn-common ................................ SUCCESS [ 46.132 s]
[INFO] hadoop-yarn-server................................. SUCCESS [  0.123 s]
[INFO] hadoop-yarn-server-common.......................... SUCCESS [ 19.166 s]
[INFO] hadoop-yarn-server-nodemanager..................... SUCCESS [ 25.552 s]
[INFO] hadoop-yarn-server-web-proxy....................... SUCCESS [  5.456 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [ 11.781 s]
[INFO] hadoop-yarn-server-resourcemanager................. SUCCESS [ 30.557 s]
[INFO] hadoop-yarn-server-tests........................... SUCCESS [  9.765 s]
[INFO] hadoop-yarn-client................................. SUCCESS [ 14.016 s]
[INFO] hadoop-yarn-applications........................... SUCCESS [  0.101 s]
[INFO] hadoop-yarn-applications-distributedshell.......... SUCCESS [  4.116 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  2.993 s]
[INFO] hadoop-yarn-site................................... SUCCESS [  0.093 s]
[INFO] hadoop-yarn-registry .............................. SUCCESS [  9.036 s]
[INFO] hadoop-yarn-project................................ SUCCESS [  6.557 s]
[INFO] hadoop-mapreduce-client............................ SUCCESS [  0.267 s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [ 36.775 s]
[INFO] hadoop-mapreduce-client-common..................... SUCCESS [ 28.049 s]
[INFO] hadoop-mapreduce-client-shuffle.................... SUCCESS [  7.285 s]
[INFO] hadoop-mapreduce-client-app........................ SUCCESS [ 17.333 s]
[INFO] hadoop-mapreduce-client-hs......................... SUCCESS [ 15.283 s]
[INFO] hadoop-mapreduce-client-jobclient.................. SUCCESS [  7.110 s]
[INFO] hadoop-mapreduce-client-hs-plugins................. SUCCESS [  3.843 s]
[INFO] Apache Hadoop MapReduce Examples................... SUCCESS [ 12.559 s]
[INFO] hadoop-mapreduce................................... SUCCESS [  6.331 s]
[INFO] Apache Hadoop MapReduce Streaming.................. SUCCESS [ 45.863 s]
[INFO] Apache Hadoop Distributed Copy..................... SUCCESS [ 46.304 s]
[INFO] Apache Hadoop Archives............................. SUCCESS [  3.575 s]
[INFO] Apache Hadoop Rumen................................ SUCCESS [ 12.991 s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [ 10.105 s]
[INFO] Apache Hadoop Data Join............................ SUCCESS [  5.021 s]
[INFO] Apache Hadoop Ant Tasks............................ SUCCESS [  3.804 s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [  5.298 s]
[INFO] Apache Hadoop Pipes................................ SUCCESS [ 10.290 s]
[INFO] Apache Hadoop OpenStack support.................... SUCCESS [  9.220 s]
[INFO] Apache Hadoop Amazon Web Services support.......... SUCCESS [11:12 min]
[INFO] Apache Hadoop Client............................... SUCCESS [ 10.714 s]
[INFO] Apache Hadoop Mini-Cluster......................... SUCCESS [  0.143 s]
[INFO] Apache Hadoop Scheduler Load Simulator............. SUCCESS [  7.664 s]
[INFO] Apache Hadoop Tools Dist........................... SUCCESS [ 29.970 s]
[INFO] Apache Hadoop Tools................................ SUCCESS [  0.057 s]
[INFO] Apache Hadoop Distribution......................... SUCCESS [ 49.425 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 32:26 min
[INFO] Finished at: 2015-03-19T19:56:40+08:00
[INFO] Final Memory: 99M/298M
[INFO] ------------------------------------------------------------------------

After a successful build, the packaged distribution is placed under hadoop-dist/target.
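
You can confirm this by listing the target directory; with -Dtar the binary distribution ends up as a tarball (the exact file name below is assumed from the version being built):

      ls hadoop-dist/target/hadoop-2.6.0.tar.gz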
