Compiling Hadoop

1. Download, upload, and extract the Hadoop source code

Download:

Download the Hadoop source tarball from the official site, hadoop.apache.org.

Upload:

Use the rz command to upload the hadoop-2.8.1-src.tar.gz source tarball to the server.

Extract:

[root@hadoop001 sourcecode]# tar -xzvf hadoop-2.8.1-src.tar.gz
(extracted under the pre-created directory /opt/sourcecode)
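
If the build host has direct internet access, the source tarball can also be fetched from the Apache archive instead of being uploaded with rz (the URL below is the archive path for 2.8.1; substitute a closer mirror if you prefer):

[root@hadoop001 sourcecode]# wget https://archive.apache.org/dist/hadoop/common/hadoop-2.8.1/hadoop-2.8.1-src.tar.gz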

2. Prepare the build environment for the Hadoop source

[root@hadoop001 hadoop-2.8.1-src]# cat BUILDING.txt
Build instructions for Hadoop


Requirements:

  • Unix System
  • JDK 1.7+
  • Maven 3.0 or later
  • Findbugs 1.3.9 (if running findbugs)
  • ProtocolBuffer 2.5.0
  • CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
  • Zlib devel (if compiling native code)
  • openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
  • Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
  • Internet connection for first build (to fetch all Maven and Hadoop dependencies)
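
Steps 3 to 7 below install these prerequisites one by one. Once they are done, a quick sanity check such as the following (assuming each tool has been added to PATH as described) shows which versions the build will pick up:

[root@hadoop001 ~]# java -version
[root@hadoop001 ~]# mvn -version
[root@hadoop001 ~]# protoc --version
[root@hadoop001 ~]# findbugs -version
[root@hadoop001 ~]# cmake --version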

3. Install Java:

Download the JDK, upload it, and extract it under the newly created directory /usr/java:

[root@hadoop001 ~]# rz    # upload jdk-8u45-linux-x64.gz
[root@hadoop001 ~]# mkdir -p /usr/java
[root@hadoop001 ~]# mv jdk-8u45-linux-x64.gz /usr/java
[root@hadoop001 ~]# cd /usr/java
[root@hadoop001 java]# tar -xzvf jdk-8u45-linux-x64.gz

Fix the owner and group of the extracted JDK:

[root@hadoop002 java]# ll
total 169388
drwxr-xr-x. 8 uucp 143 4096 Apr 11 2015 jdk1.8.0_45    # the extracted directory is owned by uucp:143 (carried over from the tarball), not by a local user
-rw-r--r--. 1 root root 173271626 Mar 16 15:25 jdk-8u45-linux-x64.gz
[root@hadoop002 java]# chown -R root:root jdk1.8.0_45
[root@hadoop002 java]# ll
total 169388
drwxr-xr-x. 8 root root 4096 Apr 11 2015 jdk1.8.0_45
-rw-r--r--. 1 root root 173271626 Mar 16 15:25 jdk-8u45-linux-x64.gz

Configure the global Java environment variables:

[root@hadoop001 java]# vi /etc/profile

export JAVA_HOME=/usr/java/jdk1.8.0_45
export PATH=$JAVA_HOME/bin:$PATH

Apply the changes and verify:
[root@hadoop001 java]# source /etc/profile
[root@hadoop001 java]# which java
/usr/java/jdk1.8.0_45/bin/java
[root@hadoop001 java]# java -version
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)

4. Install Maven:

Create the directory /opt/software and upload the Maven package there:

[root@hadoop001 ~]# mkdir -p /opt/software

[root@hadoop001 ~]# cd /opt/software/
[root@hadoop001 software]# rz
rz waiting to receive.
Starting zmodem transfer. Press Ctrl+C to cancel.
Transferring apache-maven-3.3.9-bin.zip...
100% 8415 KB 8415 KB/sec 00:00:01 0 Errors

[root@hadoop001 software]# ll
total 8432
-rw-r--r--. 1 root root 8617253 Aug 20 12:35 apache-maven-3.3.9-bin.zip

Extract it and configure the global Maven environment variables:

[root@hadoop001 software]# unzip apache-maven-3.3.9-bin.zip

[root@hadoop001 java]# vi /etc/profile

export MAVEN_HOME=/opt/software/apache-maven-3.3.9
export MAVEN_OPTS="-Xms256m -Xmx512m"
export PATH=$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH

Save the file and apply it with source /etc/profile.

Check the version:
[root@hadoop001 ~]# mvn -version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /opt/software/apache-maven-3.3.9
Java version: 1.8.0_45, vendor: Oracle Corporation
Java home: /usr/java/jdk1.8.0_45/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"
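
The first build downloads a large number of Maven artifacts, so it can help to point Maven at a nearby mirror. This is optional; the Aliyun public repository below is only an example, and any reachable mirror of Maven Central works:

[root@hadoop001 software]# vi /opt/software/apache-maven-3.3.9/conf/settings.xml

<!-- add inside the existing <mirrors> section -->
<mirror>
  <id>nexus-aliyun</id>
  <mirrorOf>central</mirrorOf>
  <name>Aliyun public mirror</name>
  <url>https://maven.aliyun.com/repository/public</url>
</mirror>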

5. Install Protobuf:

Upload and extract:

[root@hadoop001 software]# rz
rz waiting to receive.
Starting zmodem transfer. Press Ctrl+C to cancel.
Transferring protobuf-2.5.0.tar.gz...
100% 2345 KB 2345 KB/sec 00:00:01 0 Errors

[root@hadoop001 software]# tar -xzvf protobuf-2.5.0.tar.gz
[root@hadoop001 software]# ll
total 10792
drwxr-xr-x. 6 root root 4096 Nov 10 2015 apache-maven-3.3.9
-rw-r--r--. 1 root root 8617253 Aug 20 12:35 apache-maven-3.3.9-bin.zip
drwxr-xr-x. 10 109965 5000 4096 Feb 27 2013 protobuf-2.5.0
-rw-r--r--. 1 root root 2401901 Aug 20 13:03 protobuf-2.5.0.tar.gz

Change the owner and group of the protobuf directory (optional):

[root@hadoop001 software]# chown -R root:root protobuf-2.5.0

Install the compiler toolchain and build protobuf:

[root@hadoop001 software]# cd protobuf-2.5.0
[root@hadoop001 protobuf-2.5.0]# yum install -y gcc gcc-c++ make cmake    # cmake may already be installed
[root@hadoop001 protobuf-2.5.0]# ./configure --prefix=/usr/local/protobuf
[root@hadoop001 protobuf-2.5.0]# make && make install    # && runs make install only after make succeeds

Configure the global environment variables:

[root@hadoop001 java]# vi /etc/profile

export PROTOC_HOME=/usr/local/protobuf
export PATH=$PROTOC_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH

Apply the changes:
[root@hadoop001 protobuf-2.5.0]# source /etc/profile

[root@hadoop001 protobuf-2.5.0]# protoc --version
libprotoc 2.5.0
[root@hadoop001 protobuf-2.5.0]#
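
As with the Java step, which protoc is a quick way to confirm that the protoc found on the PATH is the freshly built one (the expected output assumes the PATH order configured above):

[root@hadoop001 protobuf-2.5.0]# which protoc
/usr/local/protobuf/bin/protoc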

6. Install Findbugs:

Upload and extract:

[root@hadoop001 software]# rz
rz waiting to receive.
Starting zmodem transfer. Press Ctrl+C to cancel.
Transferring findbugs-1.3.9.zip...
100% 7369 KB 7369 KB/sec 00:00:01 0 Errors

[root@hadoop001 software]# unzip findbugs-1.3.9.zip

Configure the global environment variables:

[root@hadoop002 software]# vi /etc/profile

export FINDBUGS_HOME=/opt/software/findbugs-1.3.9
export PATH=$FINDBUGS_HOME/bin:$PROTOC_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH

Save the file.

Apply the changes:
[root@hadoop001 software]# source /etc/profile
[root@hadoop001 software]# findbugs -version
1.3.9
[root@hadoop001 software]#
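
With all four exports in place, the PATH should start with the Findbugs, protobuf, Maven, and JDK bin directories in that order; echo $PATH is a quick way to confirm (the trailing part of the output is omitted here):

[root@hadoop001 software]# echo $PATH
/opt/software/findbugs-1.3.9/bin:/usr/local/protobuf/bin:/opt/software/apache-maven-3.3.9/bin:/usr/java/jdk1.8.0_45/bin:...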

7. Install the remaining dependencies:

yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake
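
Optionally, verify that the development packages the native build relies on are actually present (rpm -q simply reports the installed version of each package, or a "not installed" message):

[root@hadoop001 ~]# rpm -q openssl-devel zlib-devel snappy-devel bzip2-devel cmake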

8. Compile:

[root@hadoop001 sourcecode]# cd hadoop-2.8.1-src
[root@hadoop001 hadoop-2.8.1-src]# mvn clean package -Pdist,native -DskipTests -Dtar
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-dist ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ............................... SUCCESS [ 7.770 s]
[INFO] Apache Hadoop Build Tools ........................ SUCCESS [ 4.990 s]
[INFO] Apache Hadoop Project POM ........................ SUCCESS [ 4.920 s]
[INFO] Apache Hadoop Annotations ........................ SUCCESS [ 12.936 s]
[INFO] Apache Hadoop Assemblies ......................... SUCCESS [ 0.809 s]
[INFO] Apache Hadoop Project Dist POM ................... SUCCESS [ 5.013 s]
[INFO] Apache Hadoop Maven Plugins ...................... SUCCESS [ 17.617 s]
[INFO] Apache Hadoop MiniKDC ............................ SUCCESS [ 23.551 s]
[INFO] Apache Hadoop Auth ............................... SUCCESS [ 26.352 s]
[INFO] Apache Hadoop Auth Examples ...................... SUCCESS [ 10.305 s]
[INFO] Apache Hadoop Common ............................. SUCCESS [05:58 min]
[INFO] Apache Hadoop NFS ................................ SUCCESS [ 19.105 s]
[INFO] Apache Hadoop KMS ................................ SUCCESS [ 27.479 s]
[INFO] Apache Hadoop Common Project ..................... SUCCESS [ 0.174 s]
[INFO] Apache Hadoop HDFS Client ........................ SUCCESS [01:32 min]
[INFO] Apache Hadoop HDFS ............................... SUCCESS [06:42 min]
[INFO] Apache Hadoop HDFS Native Client ................. SUCCESS [ 20.370 s]
[INFO] Apache Hadoop HttpFS ............................. SUCCESS [ 40.720 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............ SUCCESS [ 21.298 s]
[INFO] Apache Hadoop HDFS-NFS ........................... SUCCESS [ 11.818 s]
[INFO] Apache Hadoop HDFS Project ....................... SUCCESS [ 0.148 s]
[INFO] Apache Hadoop YARN ............................... SUCCESS [ 0.192 s]
[INFO] Apache Hadoop YARN API ........................... SUCCESS [ 41.517 s]
[INFO] Apache Hadoop YARN Common ........................ SUCCESS [01:19 min]
[INFO] Apache Hadoop YARN Server ........................ SUCCESS [ 0.192 s]
[INFO] Apache Hadoop YARN Server Common ................. SUCCESS [ 19.421 s]
[INFO] Apache Hadoop YARN NodeManager ................... SUCCESS [ 42.398 s]
[INFO] Apache Hadoop YARN Web Proxy ..................... SUCCESS [ 8.925 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ..... SUCCESS [ 16.120 s]
[INFO] Apache Hadoop YARN ResourceManager ............... SUCCESS [ 57.415 s]
[INFO] Apache Hadoop YARN Server Tests .................. SUCCESS [ 3.869 s]
[INFO] Apache Hadoop YARN Client ........................ SUCCESS [ 14.325 s]
[INFO] Apache Hadoop YARN SharedCacheManager ............ SUCCESS [ 11.814 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ....... SUCCESS [ 10.027 s]
[INFO] Apache Hadoop YARN Applications .................. SUCCESS [ 0.276 s]
[INFO] Apache Hadoop YARN DistributedShell .............. SUCCESS [ 8.333 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ......... SUCCESS [ 5.473 s]
[INFO] Apache Hadoop YARN Site .......................... SUCCESS [ 0.160 s]
[INFO] Apache Hadoop YARN Registry ...................... SUCCESS [ 13.204 s]
[INFO] Apache Hadoop YARN Project ....................... SUCCESS [ 8.106 s]
[INFO] Apache Hadoop MapReduce Client ................... SUCCESS [ 0.514 s]
[INFO] Apache Hadoop MapReduce Core ..................... SUCCESS [01:09 min]
[INFO] Apache Hadoop MapReduce Common ................... SUCCESS [ 40.479 s]
[INFO] Apache Hadoop MapReduce Shuffle .................. SUCCESS [ 10.304 s]
[INFO] Apache Hadoop MapReduce App ...................... SUCCESS [ 27.335 s]
[INFO] Apache Hadoop MapReduce HistoryServer ............ SUCCESS [ 19.910 s]
[INFO] Apache Hadoop MapReduce JobClient ................ SUCCESS [ 16.657 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins .... SUCCESS [ 4.591 s]
[INFO] Apache Hadoop MapReduce Examples ................. SUCCESS [ 12.346 s]
[INFO] Apache Hadoop MapReduce .......................... SUCCESS [ 5.966 s]
[INFO] Apache Hadoop MapReduce Streaming ................ SUCCESS [ 7.940 s]
[INFO] Apache Hadoop Distributed Copy ................... SUCCESS [ 15.245 s]
[INFO] Apache Hadoop Archives ........................... SUCCESS [ 5.380 s]
[INFO] Apache Hadoop Archive Logs ....................... SUCCESS [ 5.812 s]
[INFO] Apache Hadoop Rumen .............................. SUCCESS [ 11.785 s]
[INFO] Apache Hadoop Gridmix ............................ SUCCESS [ 9.890 s]
[INFO] Apache Hadoop Data Join .......................... SUCCESS [ 5.784 s]
[INFO] Apache Hadoop Ant Tasks .......................... SUCCESS [ 3.254 s]
[INFO] Apache Hadoop Extras ............................. SUCCESS [ 5.495 s]
[INFO] Apache Hadoop Pipes .............................. SUCCESS [ 10.630 s]
[INFO] Apache Hadoop OpenStack support .................. SUCCESS [ 11.234 s]
[INFO] Apache Hadoop Amazon Web Services support ........ SUCCESS [ 14.060 s]
[INFO] Apache Hadoop Azure support ...................... SUCCESS [ 10.535 s]
[INFO] Apache Hadoop Client ............................. SUCCESS [ 13.519 s]
[INFO] Apache Hadoop Mini-Cluster ....................... SUCCESS [ 2.164 s]
[INFO] Apache Hadoop Scheduler Load Simulator ........... SUCCESS [ 10.405 s]
[INFO] Apache Hadoop Tools Dist ......................... SUCCESS [ 11.514 s]
[INFO] Apache Hadoop Azure Data Lake support ............ SUCCESS [ 9.201 s]
[INFO] Apache Hadoop Tools .............................. SUCCESS [ 0.129 s]
[INFO] Apache Hadoop Distribution ....................... SUCCESS [01:07 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 31:41 min
[INFO] Finished at: 2017-12-10T11:55:28+08:00
[INFO] Final Memory: 166M/494M
[INFO] ------------------------------------------------------------------------
[root@rzdatahadoop001 hadoop-2.8.1-src]#

9. After the build:

The build produces the binary distribution hadoop-2.8.1.tar.gz at:
/opt/sourcecode/hadoop-2.8.1-src/hadoop-dist/target/hadoop-2.8.1.tar.gz
If the build fails complaining about missing files, run mvn clean and compile again.
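
As an optional sanity check of the result (assuming the tarball is unpacked into a scratch directory such as /tmp), the bundled hadoop command can report its version and whether the native libraries were compiled in:

[root@hadoop001 target]# tar -xzvf hadoop-2.8.1.tar.gz -C /tmp
[root@hadoop001 target]# /tmp/hadoop-2.8.1/bin/hadoop version
[root@hadoop001 target]# /tmp/hadoop-2.8.1/bin/hadoop checknative -a    # lists zlib, snappy, openssl, etc. support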

Source: @若泽大数据
