Compiling Hadoop

1. Build requirements for compiling Hadoop (excerpted from BUILDING.txt in the apache/hadoop repository on GitHub):

* Unix system
* JDK 1.8
* Maven 3.3 or later
* ProtocolBuffer 2.5.0
* CMake 3.1 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* Cyrus SASL devel (if compiling native code)
* One of the compilers that support thread_local storage: GCC 4.8.1 or later, Visual Studio, Clang (community version), or Clang (version for iOS 9 and later) (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Jansson C XML parsing library (if compiling libwebhdfs)
* Doxygen (if compiling libhdfspp and generating the documentation)
* Internet connection for the first build (to fetch all Maven and Hadoop dependencies)
* python (for releasedocs)
* bats (for shell code testing)
* Node.js / bower / Ember-cli (for YARN UI v2 building)

2. Preparing the build environment

* My local JDK and Maven are already installed and configured; see the JDK and Maven installation tutorials on my blog. The corresponding /etc/profile entries are sketched after the version output below.

[root@hadoop002 ~]# java -version
java version "1.8.0_161"
Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)
[root@hadoop002 ~]# mvn -v
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /usr/local/src/maven
Java version: 1.8.0_161, vendor: Oracle Corporation
Java home: /usr/java/jdk1.8.0_161/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"
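
For reference, the /etc/profile entries behind the output above would look roughly like the following. This is only a sketch: it assumes the JDK lives in /usr/java/jdk1.8.0_161 and Maven in /usr/local/src/maven (the paths reported by mvn -v above); adjust them to your own install locations.

export JAVA_HOME=/usr/java/jdk1.8.0_161
export MAVEN_HOME=/usr/local/src/maven
export PATH=$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH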



2.1 Install ProtocolBuffer 2.5.0
Create the working directory and enter it
mkdir /opt/software
cd /opt/software
Upload protobuf-2.5.0.tar.gz with the rz command, then extract it
tar -zxvf protobuf-2.5.0.tar.gz
Enter the extracted directory
cd protobuf-2.5.0
Install the GCC toolchain
yum install -y gcc gcc-c++ make cmake
Configure the installation prefix
./configure --prefix=/usr/local/src/protobuf
Compile and install
make && make install
After installation, configure the environment variables
vi /etc/profile
Add the first line below and extend PATH as shown in the second line
export PROTOC_HOME=/usr/local/src/protobuf
export PATH=$PROTOC_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
Reload the profile
source /etc/profile
Verify the installation
protoc --version
If it prints libprotoc 2.5.0, the installation succeeded
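
Beyond checking the version string, you can optionally hand protoc a tiny file and make sure it actually generates code. This is only a sanity-check sketch; test.proto, the demo package, and the Pair message are made-up names:

cat > /tmp/test.proto <<'EOF'
package demo;
message Pair {
  required string key = 1;
  optional string value = 2;
}
EOF
protoc --proto_path=/tmp --java_out=/tmp /tmp/test.proto
ls /tmp/demo    # should contain Test.java generated from test.proto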


2.2 Install findbugs-1.3.9
Enter /opt/software, upload findbugs-1.3.9.zip, and unzip it
cd /opt/software
unzip findbugs-1.3.9.zip
Configure the environment variables
vi /etc/profile
Add the first line below and extend PATH as shown in the second line
export FINDBUGS_HOME=/opt/software/findbugs-1.3.9
export PATH=$FINDBUGS_HOME/bin:$PROTOC_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
Reload the profile
source /etc/profile
Verify the installation and configuration
findbugs -version
If it prints 1.3.9, the setup succeeded
2.3 Install other dependencies
yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake
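To double-check that the development headers actually got installed, you can query rpm; this is an optional sanity check:
rpm -q openssl-devel zlib-devel snappy-devel bzip2-devel lzo-devel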
2.4 Upload the Hadoop source tarball to /opt/sourcecode and extract it
cd /opt/sourcecode
tar -zxvf hadoop-2.8.3-src.tar.gz
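Before building, it is worth skimming the BUILDING.txt shipped at the root of this exact source tree, since requirements occasionally differ between Hadoop versions:
more /opt/sourcecode/hadoop-2.8.3-src/BUILDING.txt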
3. Compile Hadoop

Enter the root directory of the extracted Hadoop source

cd /opt/sourcecode/hadoop-2.8.3-src
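Before kicking off the build, it may help to give Maven a larger heap; Hadoop's BUILDING.txt suggests roughly the following (the exact sizes are only a starting point, tune them for your machine):
export MAVEN_OPTS="-Xms256m -Xmx1536m"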
Run the Maven command to build Hadoop
mvn clean package -Pdist,native -DskipTests -Dtar
If many modules show up as SKIPPED at the end, simply re-run the Maven command to rebuild.
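If you also want the native build to link against the snappy libraries installed in step 2.3, BUILDING.txt describes extra switches for that; a hedged example is shown below (check the flag names against the BUILDING.txt in your source tree before relying on them):
mvn clean package -Pdist,native -DskipTests -Dtar -Drequire.snappy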

A successful build ends with output like the following:

[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................. SUCCESS [  1.577 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  1.199 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  1.518 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  5.379 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.385 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  1.733 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  5.414 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  7.091 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 50.025 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 21.039 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [03:55 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 16.017 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 15.423 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.067 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [01:18 min]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [01:47 min]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [ 13.954 s]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 23.811 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 17.039 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 17.121 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.052 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [  0.067 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [ 30.257 s]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [01:08 min]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [  0.042 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [ 10.273 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [ 20.983 s]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [  5.073 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [  9.102 s]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [ 42.903 s]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [  2.372 s]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [  8.963 s]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [  4.908 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [  5.405 s]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [  0.060 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [  4.179 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [  2.929 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [  0.312 s]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [  6.255 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [  5.161 s]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [  0.192 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [ 44.728 s]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [ 25.745 s]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [  4.680 s]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [ 12.156 s]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [  7.107 s]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [  9.399 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [  2.791 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  7.445 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [  3.707 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  5.327 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [  6.529 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  3.664 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [  4.140 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  7.198 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 10.784 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  3.224 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  2.552 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  4.611 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  9.819 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  5.859 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [  7.436 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [  8.711 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [  7.028 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  1.339 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 12.959 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [  3.996 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 10.394 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.041 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:01 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:06 min
[INFO] Finished at: 2018-05-23T17:47:45+08:00
[INFO] Final Memory: 255M/641M
[INFO] ------------------------------------------------------------------------
The built distribution tarball is hadoop-2.8.3.tar.gz under /opt/sourcecode/hadoop-2.8.3-src/hadoop-dist/target/.
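A quick way to confirm the result is to extract the tarball into a scratch directory and ask Hadoop itself about the native libraries. The /opt/app path below is just an assumed scratch location:
mkdir -p /opt/app
tar -zxvf /opt/sourcecode/hadoop-2.8.3-src/hadoop-dist/target/hadoop-2.8.3.tar.gz -C /opt/app
/opt/app/hadoop-2.8.3/bin/hadoop version
/opt/app/hadoop-2.8.3/bin/hadoop checknative -a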



Tips:

1. Sometimes a dependency download stalls for a long time during the build because the connection to the remote site hangs; in that case press Ctrl+C and re-run the build command.
2. If the build complains about a missing file, clean the Maven build first (with mvn clean) and then rebuild.




