Compiling Hadoop 2.8.3 from Source

Overview

Hadoop can be obtained by building the source with Maven (this walkthrough uses the source-build approach).

1. Preparation

1.1 Download the Hadoop source tarball and build dependencies

Download links

[root@hadoop ~]# cd /tmp/
[root@hadoop tmp]# wget https://github.com/apache/hadoop/archive/rel/release-2.8.3.tar.gz
[root@hadoop tmp]# wget ftp://ftp.netbsd.org/pub/pkgsrc/distfiles/protobuf-2.5.0.tar.gz
[root@hadoop tmp]# wget --no-check-certificate https://sourceforge.net/projects/findbugs/files/findbugs/1.3.9/findbugs-1.3.9.tar.gz/download -O findbugs-1.3.9.tar.gz
1.2 Install Oracle JDK 1.8 (avoid OpenJDK where possible)

Download link

[root@hadoop tmp]# yum install -y lrzsz
[root@hadoop tmp]# ls
jdk-8u151-linux-x64.tar.gz  release-2.8.3.tar.gz

#### Create /usr/java; CDH and many scripts expect the JDK under this directory by default
[root@hadoop tmp]# mkdir /usr/java
[root@hadoop tmp]# tar xf jdk-8u151-linux-x64.tar.gz -C /usr/java/
[root@hadoop tmp]# ln -s /usr/java/jdk1.8.0_151/ /usr/java/jdk

#### Set global environment variables
[root@hadoop local]# vim /etc/profile
JAVA_HOME=/usr/java/jdk
PATH=$JAVA_HOME/bin:$PATH
CLASSPATH=$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

export JAVA_HOME
export PATH
export CLASSPATH
[root@hadoop tmp]# source /etc/profile

#### Verify the environment variables
[root@hadoop ~]# which java
/usr/java/jdk/bin/java
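Before baking JAVA_HOME into /etc/profile it is worth confirming it really points at a JDK. The helper below is illustrative and not part of the original notes:

```shell
# Illustrative sanity check: verify that a candidate JAVA_HOME
# actually contains an executable java binary.
check_jdk() {
  if [ -x "$1/bin/java" ]; then
    echo "ok"
  else
    echo "missing"
  fi
}

check_jdk /usr/java/jdk   # reports whether the JDK is unpacked as above
```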
1.3 Install Maven (version 3.3.9)

Download link

[root@hadoop tmp]# wget https://archive.apache.org/dist/maven/maven-3/3.5.2/binaries/apache-maven-3.5.2-bin.tar.gz
[root@hadoop tmp]# tar xf apache-maven-3.5.2-bin.tar.gz -C /usr/local
[root@hadoop tmp]# cd /usr/local/
[root@hadoop local]# ln -s apache-maven-3.5.2/ apache-maven

#### Set global environment variables
[root@hadoop local]# vim /etc/profile

#### CDH expects the JVM under /usr/java by default
JAVA_HOME=/usr/java/jdk
MAVEN_HOME=/usr/local/apache-maven
PATH=$JAVA_HOME/bin:$MAVEN_HOME/bin:$PATH
CLASSPATH=$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

export JAVA_HOME
export MAVEN_HOME
export PATH
export CLASSPATH
[root@hadoop tmp]# source /etc/profile

#### Verify the environment variables
[root@hadoop ~]# which mvn
/usr/local/apache-maven/bin/mvn


#### Note: building with the then-latest Maven 3.5.2 produced errors

2. Build and Install Hadoop

2.1 Create the hadoop user and group
[root@hadoop ~]# useradd -u 515 -m  hadoop -s /bin/bash
2.2 Unpack the Hadoop source and compile

Reference link

[root@hadoop ~]# yum install screen -y
[root@hadoop ~]# cd /tmp/
[root@hadoop tmp]# yum install -y cmake gcc gcc-c++

#### Install protobuf
[root@hadoop tmp]# tar xf protobuf-2.5.0.tar.gz
[root@hadoop tmp]# cd protobuf-2.5.0
[root@hadoop protobuf-2.5.0]# ./configure && make && make install
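protobuf installs under /usr/local by default, so the freshly built shared library may not be on the linker path yet. A quick check (this step is an addition to the original notes; LD_LIBRARY_PATH avoids editing /etc/ld.so.conf on a shared machine):

```shell
# Make protoc and its shared library in /usr/local/lib visible,
# then confirm the version if protoc is already installed.
export LD_LIBRARY_PATH=/usr/local/lib:${LD_LIBRARY_PATH:-}
if command -v protoc >/dev/null 2>&1; then
  protoc --version   # the Hadoop 2.8.3 build requires exactly libprotoc 2.5.0
else
  echo "protoc not on PATH yet"
fi
```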

#### Install findbugs
[root@hadoop tmp]# tar xf findbugs-1.3.9.tar.gz
[root@hadoop tmp]# mv findbugs-1.3.9 /usr/local/
[root@hadoop tmp]# ln -s /usr/local/findbugs-1.3.9 /usr/local/findbugs
[root@hadoop tmp]# vim /etc/profile
JAVA_HOME=/usr/java/jdk
MAVEN_HOME=/usr/local/apache-maven
FINDBUGS_HOME=/usr/local/findbugs
HADOOP_HOME=/usr/local/hadoop
PATH=$JAVA_HOME/bin:$MAVEN_HOME/bin:$FINDBUGS_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH
CLASSPATH=$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

export JAVA_HOME
export MAVEN_HOME
export FINDBUGS_HOME
export HADOOP_HOME
export PATH
export CLASSPATH
[root@hadoop ~]# source /etc/profile

#### Install Apache Forrest
[root@hadoop tmp]# tar xf apache-forrest-0.9-sources.tar.gz
[root@hadoop tmp]# tar xf apache-forrest-0.9-dependencies.tar.gz
[root@hadoop tmp]# cp -aPr apache-forrest-0.9 /usr/local/
[root@hadoop tmp]# ln -s /usr/local/apache-forrest-0.9 /usr/local/apache-forrest
[root@hadoop tmp]# vim /etc/profile
JAVA_HOME=/usr/java/jdk
MAVEN_HOME=/usr/local/apache-maven
FINDBUGS_HOME=/usr/local/findbugs
FORREST_HOME=/usr/local/apache-forrest
HADOOP_HOME=/usr/local/hadoop
PATH=$JAVA_HOME/bin:$MAVEN_HOME/bin:$FINDBUGS_HOME/bin:$FORREST_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH
CLASSPATH=$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

export JAVA_HOME
export MAVEN_HOME
export FINDBUGS_HOME 
export FORREST_HOME
export HADOOP_HOME
export PATH
export CLASSPATH
[root@hadoop ~]# source /etc/profile

#### Other dependencies
yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake
yum install -y ant patch
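With everything in place, a quick loop confirms that each tool the build will invoke is resolvable on PATH (the tool list mirrors the requirements in BUILDING.txt; this check itself is an addition, not part of the original notes):

```shell
# Report which required build tools are resolvable on PATH.
for tool in java mvn protoc cmake findbugs ant; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```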

[root@hadoop tmp]# tar xf release-2.8.3.tar.gz 
[root@hadoop tmp]# cd hadoop-rel-release-2.8.3/

#### Build reference shipped in the source tree
[root@hadoop001 hadoop-rel-release-2.8.3]# cat BUILDING.txt
Build instructions for Hadoop

----------------------------------------------------------------------------------
Requirements:

* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)

Building distributions:

Create binary distribution without native code and without documentation:
  $ mvn package -Pdist -DskipTests -Dtar -Dmaven.javadoc.skip=true

Create binary distribution with native code and with documentation:
  $ mvn package -Pdist,native,docs -DskipTests -Dtar

Create source distribution:
  $ mvn package -Psrc -DskipTests

Create source and binary distributions with native code and documentation:
  $ mvn package -Pdist,native,docs,src -DskipTests -Dtar

Create a local staging version of the website (in /tmp/hadoop-site)
  $ mvn clean site -Preleasedocs; mvn site:stage -DstagingDirectory=/tmp/hadoop-site



[root@hadoop hadoop-rel-release-2.8.3]# screen -S hadoop-complie
[root@hadoop hadoop-rel-release-2.8.3]# source /etc/profile
[root@hadoop hadoop-rel-release-2.8.3]# mvn clean package -Pdist,native -DskipTests -Dtar

Press Ctrl+a, then d, to detach from the screen session; the build keeps running and the session can be reattached later with screen -r hadoop-complie
[root@hadoop hadoop-rel-release-2.8.3]# screen -list
There is a screen on:
    29028.hadoop-complie    (Detached)


     [exec] Hadoop dist tar available at: /tmp/hadoop-rel-release-2.8.3/hadoop-dist/target/hadoop-2.8.3.tar.gz
     [exec] 
[INFO] Executed tasks
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................. SUCCESS [ 16.396 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  0.635 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.771 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  2.545 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.190 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  1.394 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.957 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  4.994 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  5.381 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  3.455 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:22 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  5.536 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 25.214 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.046 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [01:15 min]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [01:12 min]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [  9.247 s]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 26.437 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 13.187 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  3.630 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.035 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [  0.034 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [ 17.175 s]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [01:23 min]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [  0.040 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [  6.368 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [ 15.875 s]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [  3.191 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [ 21.611 s]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [ 23.891 s]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [  1.207 s]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [  5.441 s]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [  3.466 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [  3.128 s]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [  0.030 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [  2.742 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [  1.930 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [  0.050 s]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [  4.734 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [  5.174 s]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [  0.141 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [ 24.020 s]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [ 17.155 s]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [  3.572 s]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [  9.888 s]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [  5.123 s]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [ 10.669 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [  2.302 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  5.157 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [  3.424 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  7.017 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [  5.061 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  2.145 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [  2.210 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  5.128 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  4.210 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  2.330 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  2.208 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  3.015 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  8.250 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  4.430 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 17.715 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [  7.833 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [  5.975 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.898 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  4.790 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [ 12.119 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [  6.030 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.043 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 34.428 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:53 min
[INFO] Finished at: 2017-12-22T17:39:10+08:00
[INFO] Final Memory: 241M/592M
[INFO] ------------------------------------------------------------------------

real    12m55.948s
user    19m56.246s
sys     1m13.708s


#### Tarball produced by the successful build
[root@hadoop hadoop-rel-release-2.8.3]# cd hadoop-dist/target/
[root@hadoop target]# ls hadoop-2.8.3.tar.gz 
hadoop-2.8.3.tar.gz
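To confirm the -Pnative profile actually compiled the native libraries into the tarball, unpack it and run the standard hadoop checknative subcommand (paths below follow the HADOOP_HOME set in /etc/profile earlier; this verification step is an addition to the original notes):

```shell
# Unpack the built distribution and check which native libraries
# were compiled in; checknative prints true/false per library.
if [ -f hadoop-2.8.3.tar.gz ]; then
  tar xf hadoop-2.8.3.tar.gz -C /usr/local
  ln -s /usr/local/hadoop-2.8.3 /usr/local/hadoop
  /usr/local/hadoop/bin/hadoop checknative
  # reports status for hadoop, zlib, snappy, lz4, bzip2 and openssl
fi
```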

Tips:
1. A dependency download sometimes takes forever because the connection to the
remote site has silently stalled; press Ctrl+C and re-run the build command.
2. If the build reports a missing file, clean the previous Maven output first
(mvn clean) and then recompile.
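When a single module fails partway through, Maven can also resume from that module instead of rebuilding everything; the -rf target comes straight from the resume hint Maven prints at the end of a failed run (here hadoop-common, as in the error output below):

```shell
# Resume the build from the failed module, keeping the same
# profiles and flags as the original command.
if [ -d /tmp/hadoop-rel-release-2.8.3 ]; then
  cd /tmp/hadoop-rel-release-2.8.3
  mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-common
fi
```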

Reference link

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (site) on project hadoop-common: An Ant BuildException has occured: stylesheet /tmp/hadoop-rel-release-2.8.3/hadoop-common-project/hadoop-common/${env.FINDBUGS_HOME}/src/xsl/default.xsl doesn't exist.
[ERROR] around Ant part ...<xslt in="/tmp/hadoop-rel-release-2.8.3/hadoop-common-project/hadoop-common/target/findbugsXml.xml" style="${env.FINDBUGS_HOME}/src/xsl/default.xsl" out="/tmp/hadoop-rel-release-2.8.3/hadoop-common-project/hadoop-common/target/site/findbugs.html"/>... @ 33:251 in /tmp/hadoop-rel-release-2.8.3/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common



Problems encountered during the build:
[ERROR] Failed to execute goal on project hadoop-common: Could not resolve dependencies for project org.apache.hadoop:hadoop-common:jar:2.7.1: Could not transfer artifact org.apache.commons:commons-math3:jar:3.1.1 from/to nexus-osc (http://maven.oschina.net/content/groups/public/): GET request of: org/apache/commons/commons-math3/3.1.1/commons-math3-3.1.1.jar from nexus-osc failed: Premature end of Content-Length delimited message body (expected: 1599627; received: 866169 -> [Help 1]

[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.7.1:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]

If the build keeps complaining about missing tools, install the full development toolset, as the thrift build instructions also recommend:

yum -y groupinstall "Development Tools"

ant is required as well: yum install -y ant

The error below means cmake is missing (yum install -y cmake):

Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/root/hadoop-2.7.1-src/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory

findbugs is required:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (site) on project hadoop-common: An Ant BuildException has occured: stylesheet /home/hadoop/hadoop-2.7.1-src/hadoop-common-project/hadoop-common/${env.FINDBUGS_HOME}/src/xsl/default.xsl doesn't exist.

[ERROR] around Ant part ...<xslt in="/home/hadoop/hadoop-2.7.1-src/hadoop-common-project/hadoop-common/target/findbugsXml.xml" style="${env.FINDBUGS_HOME}/src/xsl/default.xsl" out="/home/hadoop/hadoop-2.7.1-src/hadoop-common-project/hadoop-common/target/site/findbugs.html"/>... @ 43:251 in /home/hadoop/hadoop-2.7.1-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml

Fix: set the environment variable, e.g. export FINDBUGS_HOME=/usr/local/findbugs-3.0.0

cmake is also needed when building hadoop-pipes:

Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec dir="/home/pory/workplace/hadoop-2.4.1-src/hadoop-tools/hadoop-pipes/target/native" executable="cmake" failonerror="true">... @ 5:131 in /home/pory/workplace/hadoop-2.4.1-src/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml