Compiling Hadoop 2.7.4 on CentOS 7.3

Copyright notice: This is the author's original article and may not be reproduced without permission. https://blog.csdn.net/gavinguo1987/article/details/77744150


I. Background

The binary packages on the Hadoop download site, such as hadoop-2.7.4.tar.gz, are built in a 32-bit Linux environment, so the bundled native libraries can fail or fall back to Java implementations on 64-bit hosts. For a 64-bit production environment you should therefore download the source and build Hadoop yourself on a 64-bit system.
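One quick way to confirm the mismatch is to look at what `file` says about the bundled native library. Below is a minimal sketch; the helper name is mine, and the install path in the usage comment is an assumption:

```shell
# classify_elf_desc: map `file`'s description of a binary to its word size.
classify_elf_desc() {
  case "$1" in
    *"ELF 64-bit"*) echo 64 ;;
    *"ELF 32-bit"*) echo 32 ;;
    *)              echo unknown ;;
  esac
}

# Usage (path assumes the official tarball was unpacked to /usr/local):
# classify_elf_desc "$(file -b /usr/local/hadoop-2.7.4/lib/native/libhadoop.so.1.0.0)"
# On a 64-bit host you want this to print 64, not 32.
```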

 

II. Environment

CentOS Linux release 7.3.1611

JDK 1.8.0_144

Maven 3.5.0

 

III. Build dependencies

See src/BUILDING.txt for the full dependency list.

gcc

GNU Autotools chain: autoconf, automake, libtool

cmake

snappy

gzip

bzip2

protobuf (must be protobuf 2.5)

zlib

openssl

findbugs

  

Install protobuf

cd /usr/local

tar -zxvf protobuf-2.5.0.tar.gz

cd protobuf-2.5.0

./configure --prefix=/usr/local/protoc/

make && make install

vi /etc/profile

PROTOC_HOME=/usr/local/protoc/

PATH=$PROTOC_HOME/bin:$PATH

export PATH

source /etc/profile

protoc --version
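The steps above assume protobuf-2.5.0.tar.gz was already downloaded into /usr/local. Since Hadoop 2.7.x insists on exactly this protobuf release, it is worth checking the version string programmatically rather than by eye. A small sketch (the helper name is my own):

```shell
# check_protoc_output: verify that `protoc --version` output names 2.5.0,
# the release Hadoop 2.7.x's generated protobuf stubs require.
check_protoc_output() {
  case "$1" in
    "libprotoc 2.5.0") echo ok ;;
    *) echo "need libprotoc 2.5.0, got: $1" >&2; return 1 ;;
  esac
}

# Usage:
# check_protoc_output "$(protoc --version)"
```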

 

Install findbugs

cd /usr/local

tar -zxvf findbugs-3.0.1.tar.gz

vi /etc/profile

FINDBUGS_HOME=/usr/local/findbugs-3.0.1/

PATH=$FINDBUGS_HOME/bin:$PATH

export PATH

source /etc/profile

findbugs -version

 

Install the remaining packages

yum install gcc autoconf automake libtool cmake snappy gzip bzip2 zlib openssl
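yum silently skips packages that are already present, so a quick sanity check that the command-line tools actually landed on PATH can save a failed build later. The loop below is a sketch (function name is mine); note that BUILDING.txt also lists development headers such as zlib-devel and openssl-devel for the native build, and those library packages ship no executable to probe, so check them with `rpm -q` instead:

```shell
# missing_tools: print any of the given commands that are not found on PATH.
missing_tools() {
  for t in "$@"; do
    command -v "$t" >/dev/null 2>&1 || echo "$t"
  done
}

# Usage against the CLI tools installed above (for libraries like zlib or
# snappy, use `rpm -q zlib zlib-devel` and similar instead):
# missing_tools gcc autoconf automake libtool cmake gzip bzip2
```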

 

After installation, you can rerun the command to verify; the output looks like this:

Loaded plugins: fastestmirror, langpacks

Loading mirror speeds from cached hostfile

 * base: mirrors.163.com

 * extras: mirrors.163.com

 * updates: mirrors.163.com

Package gcc-4.8.5-11.el7.x86_64 already installed and latest version

Package autoconf-2.69-11.el7.noarch already installed and latest version

Package automake-1.13.4-3.el7.noarch already installed and latest version

Package libtool-2.4.2-22.el7_3.x86_64 already installed and latest version

Package cmake-2.8.12.2-2.el7.x86_64 already installed and latest version

Package snappy-1.1.0-3.el7.x86_64 already installed and latest version

Package gzip-1.5-8.el7.x86_64 already installed and latest version

Package bzip2-1.0.6-13.el7.x86_64 already installed and latest version

Package zlib-1.2.7-17.el7.x86_64 already installed and latest version

Package 1:openssl-1.0.1e-60.el7_3.1.x86_64 already installed and latest version

Nothing to do

 

IV. Compiling Hadoop

1. Download and extract the source

https://archive.apache.org/dist/hadoop/common/hadoop-2.7.4/hadoop-2.7.4-src.tar.gz

Place the downloaded archive in /usr/local:

cd /usr/local

tar -zxvf hadoop-2.7.4-src.tar.gz

 

2. Compile and package with mvn

cd /usr/local/hadoop-2.7.4-src

mvn package -e -X -Pdist,native -DskipTests -Dtar

The build takes roughly 20 minutes (23:42 in the log below).
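If the reactor build dies with an OutOfMemoryError partway through, give Maven more heap before retrying. The sizes below are an assumption that suits a build of this scale, not a hard requirement:

```shell
# Assumption: modest heap bump for the ~60-module reactor build.
export MAVEN_OPTS="-Xms256m -Xmx1024m"
mvn package -e -X -Pdist,native -DskipTests -Dtar
```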

 

3. Log of a successful build

[INFO] ------------------------------------------------------------------------

[INFO] Reactor Summary:

[INFO]

[INFO] Apache Hadoop Main ................................. SUCCESS [  3.096 s]

[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.402 s]

[INFO] Apache Hadoop Project POM .......................... SUCCESS [  2.533 s]

[INFO] Apache Hadoop Annotations .......................... SUCCESS [  5.534 s]

[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.604 s]

[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  3.148 s]

[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  8.503 s]

[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 12.049 s]

[INFO] Apache Hadoop Auth ................................. SUCCESS [ 12.412 s]

[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  8.417 s]

[INFO] Apache Hadoop Common ............................... SUCCESS [02:22 min]

[INFO] Apache Hadoop NFS .................................. SUCCESS [ 12.718 s]

[INFO] Apache Hadoop KMS .................................. SUCCESS [ 43.136 s]

[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.140 s]

[INFO] Apache Hadoop HDFS ................................. SUCCESS [04:29 min]

[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 39.289 s]

[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 29.048 s]

[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  9.293 s]

[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.114 s]

[INFO] hadoop-yarn ........................................ SUCCESS [  0.128 s]

[INFO] hadoop-yarn-api .................................... SUCCESS [01:17 min]

[INFO] hadoop-yarn-common ................................. SUCCESS [02:07 min]

[INFO] hadoop-yarn-server ................................. SUCCESS [  0.121 s]

[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 22.825 s]

[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 34.918 s]

[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  8.009 s]

[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 15.709 s]

[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 40.880 s]

[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 10.857 s]

[INFO] hadoop-yarn-client ................................. SUCCESS [ 13.076 s]

[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [  9.424 s]

[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.135 s]

[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  6.902 s]

[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  4.778 s]

[INFO] hadoop-yarn-site ................................... SUCCESS [  0.139 s]

[INFO] hadoop-yarn-registry ............................... SUCCESS [ 12.213 s]

[INFO] hadoop-yarn-project ................................ SUCCESS [  7.393 s]

[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.444 s]

[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 38.882 s]

[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 36.858 s]

[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  9.322 s]

[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 17.605 s]

[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 11.796 s]

[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 19.967 s]

[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  4.212 s]

[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 11.684 s]

[INFO] hadoop-mapreduce ................................... SUCCESS [  5.232 s]

[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 12.640 s]

[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 15.882 s]

[INFO] Apache Hadoop Archives ............................. SUCCESS [  4.802 s]

[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 10.899 s]

[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  9.432 s]

[INFO] Apache Hadoop Data Join ............................ SUCCESS [  5.211 s]

[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  4.540 s]

[INFO] Apache Hadoop Extras ............................... SUCCESS [  6.042 s]

[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 11.415 s]

[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  9.638 s]

[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [01:00 min]

[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 16.567 s]

[INFO] Apache Hadoop Client ............................... SUCCESS [ 13.889 s]

[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  2.306 s]

[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 11.684 s]

[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 13.586 s]

[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.073 s]

[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 56.358 s]

[INFO] ------------------------------------------------------------------------

[INFO] BUILD SUCCESS

[INFO] ------------------------------------------------------------------------

[INFO] Total time: 23:42 min

[INFO] Finished at: 2017-08-30T23:44:38+08:00

[INFO] Final Memory: 210M/651M

[INFO] ------------------------------------------------------------------------

 

4. Collect the build output

The compiled distribution is at:

/usr/local/hadoop-2.7.4-src/hadoop-dist/target/hadoop-2.7.4

 

The compiled tar.gz is at:

/usr/local/hadoop-2.7.4-src/hadoop-dist/target/hadoop-2.7.4.tar.gz

This tar.gz is the usable 64-bit installation package.

 

The compiled native libraries are at:

/usr/local/hadoop-2.7.4-src/hadoop-dist/target/hadoop-2.7.4/lib/native

 

Alternatively, you can replace only the native directory inside an existing 32-bit Hadoop installation:

cp -r /usr/local/hadoop-2.7.4-src/hadoop-dist/target/hadoop-2.7.4/lib/native /home/hadoop/lib

When replacing just the native directory, remember to set the correct ownership and permissions on the copied files.
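The copy-plus-permissions step can be wrapped up as follows. This is a sketch: the destination layout (/home/hadoop/hadoop2.7/lib) is an assumption taken from the verification step, so adjust both paths to your deployment.

```shell
# replace_native: swap a freshly built native/ directory into an existing
# install and normalize permissions. Run as the hadoop user, or chown after.
replace_native() {
  src="$1"   # e.g. /usr/local/hadoop-2.7.4-src/hadoop-dist/target/hadoop-2.7.4/lib/native
  dst="$2"   # e.g. /home/hadoop/hadoop2.7/lib  (assumed install layout)
  rm -rf "$dst/native"          # drop the old 32-bit libraries
  cp -r "$src" "$dst/native"    # copy in the 64-bit build
  chmod -R 755 "$dst/native"    # make the shared objects readable/executable
}

# Usage:
# replace_native /usr/local/hadoop-2.7.4-src/hadoop-dist/target/hadoop-2.7.4/lib/native /home/hadoop/hadoop2.7/lib
```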

 


 

5. Verify

cd /home/hadoop/hadoop2.7/bin

./hadoop checknative -a

 

 
