Recompiling the 64-bit Native Libraries for Hadoop 2.7.3
Environment: VirtualBox virtual machine running 64-bit CentOS 6.7
1. Install the basic build tools
yum -y install gcc gcc-c++
yum -y install lzo-devel zlib-devel autoconf automake libtool cmake openssl-devel
yum -y install svn ncurses-devel libXtst
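After the yum steps above, a quick sanity check that the toolchain actually landed on the PATH can save a failed build later. This is a small sketch, not part of the original guide; the tool names correspond to the packages installed above:

```shell
# Sanity check: confirm each build tool installed above is on the PATH.
# Prints one OK/MISSING line per tool.
for tool in gcc g++ make cmake autoconf automake libtool; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "OK: $tool"
  else
    echo "MISSING: $tool"
  fi
done
```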
2. Download the packages needed for the rebuild
apache-ant-1.9.6-bin.tar.gz
findbugs-3.0.1.tar.gz
protobuf-2.5.0.tar.gz
apache-maven-3.3.3-bin.tar.gz
Download the Hadoop 2.7.3 source package
hadoop-2.7.3-src.tar.gz
Unpack the source package
[grid@hadoopMaster01 ~]$ tar -zxvf hadoop-2.7.3-src.tar.gz
3. Install the software required for the build
(1) Install Maven
Unpack apache-maven-3.3.3-bin.tar.gz into /opt/
[root@hadoopMaster01 grid]# tar -zxvf apache-maven-3.3.3-bin.tar.gz -C /opt/
Edit /etc/profile and add the Maven environment settings
vim /etc/profile
export MAVEN_HOME=/opt/apache-maven-3.3.3
export PATH=$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin
Save, then run source /etc/profile so the change takes effect immediately
[root@hadoopMaster01 apache-maven-3.3.3]# source /etc/profile
Verify with the mvn -v command; output as shown in the figure indicates a successful installation
Edit the Maven configuration file
vim /opt/apache-maven-3.3.3/conf/settings.xml
--- Change the default JDK version Maven uses:
<profile>
  <id>jdk-1.7</id>
  <activation>
    <activeByDefault>true</activeByDefault>
    <jdk>1.7</jdk>
  </activation>
  <properties>
    <maven.compiler.source>1.7</maven.compiler.source>
    <maven.compiler.target>1.7</maven.compiler.target>
    <maven.compiler.compilerVersion>1.7</maven.compiler.compilerVersion>
  </properties>
</profile>
---- For faster downloads, you can use the Alibaba Cloud (Aliyun) mirror:
<mirror>
  <id>nexus-aliyun</id>
  <mirrorOf>*</mirrorOf>
  <name>Nexus aliyun</name>
  <url>http://maven.aliyun.com/nexus/content/groups/public</url>
</mirror>
---------------------------------------------
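Both snippets above must be placed inside the correct parent elements of settings.xml, or Maven will ignore them. A minimal placement sketch (element names per the standard Maven settings schema; the actual content is the snippets shown above):

```xml
<settings>
  <mirrors>
    <!-- the <mirror> block goes here -->
  </mirrors>
  <profiles>
    <!-- the <profile> block goes here -->
  </profiles>
</settings>
```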
(2) Install Ant
Unpack apache-ant-1.9.6-bin.tar.gz into /opt/
[root@hadoopMaster01 grid]# tar -zxvf apache-ant-1.9.6-bin.tar.gz -C /opt/
Edit /etc/profile and add the Ant environment settings
export MAVEN_HOME=/opt/apache-maven-3.3.3
export ANT_HOME=/opt/apache-ant-1.9.6
export HADOOP_HOME=/hadoop/app/hadoop-2.7.3
export PATH=$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin:$ANT_HOME/bin
Save, then run source /etc/profile so the change takes effect immediately
[root@hadoopMaster01 apache-ant-1.9.6]# source /etc/profile
Verify with the ant -version command; output as shown in the figure indicates a successful installation
(3) Install FindBugs
Unpack findbugs-3.0.1.tar.gz into /opt/
[root@hadoopMaster01 grid]# tar -zxvf findbugs-3.0.1.tar.gz -C /opt/
Edit /etc/profile and add the FindBugs environment settings
export MAVEN_HOME=/opt/apache-maven-3.3.3
export ANT_HOME=/opt/apache-ant-1.9.6
export FINDBUGS_HOME=/opt/findbugs-3.0.1
export HADOOP_HOME=/hadoop/app/hadoop-2.7.3
export PATH=$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin:$ANT_HOME/bin:$FINDBUGS_HOME/bin
Save, then run source /etc/profile so the change takes effect immediately
[root@hadoopMaster01 findbugs-3.0.1]# source /etc/profile
Verify with the findbugs -version command; output as shown in the figure indicates a successful installation
(4) Install Protobuf (building Hadoop 2.7.3 requires the protobuf compiler protoc, and it must be version 2.5.0; newer releases fail Hadoop's protoc version check)
---> Unpack
tar -zxvf protobuf-2.5.0.tar.gz
---> Enter the directory
cd protobuf-2.5.0
---> Run the configure check
./configure
---> Compile
make
---> Install
make install
---> Verify the installation
protoc --version
(it should print libprotoc 2.5.0)
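On CentOS, protoc sometimes fails immediately after make install with an error about libprotoc.so not being found. Refreshing the linker's shared-library cache (as root, matching the root shell used above) usually resolves it; this tip is an addition, not part of the original guide:

```shell
# Refresh the shared-library cache so the newly installed libprotoc is found.
ldconfig
protoc --version   # per this guide, it should print: libprotoc 2.5.0
```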
(5) Build the 64-bit native libraries (give the virtual machine as much memory as possible for the Hadoop build; at least 2 GB)
Enter the unpacked Hadoop source directory
[grid@hadoopMaster01 ~]$ cd /hadoop/app/hadoop-2.7.3-src
[grid@hadoopMaster01 hadoop-2.7.3-src]$ pwd
/hadoop/app/hadoop-2.7.3-src
Run the following commands in order and wait for them to finish (they will download many dependencies over the network)
[grid@hadoopMaster01 hadoop-2.7.3-src]$ mvn clean install -DskipTests
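The native build can exhaust the default Maven JVM heap on a small VM. Raising it via MAVEN_OPTS before the package step is a common precaution; the values below are a typical starting point, not part of the original guide:

```shell
# Give the Maven JVM a larger heap for the Hadoop native build;
# tune the -Xmx value to the memory allocated to your VM.
export MAVEN_OPTS="-Xms256m -Xmx1536m"
```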
Then run mvn clean package -Pdist,native -DskipTests -Dtar to start the build, and wait for it to finish
[grid@hadoopMaster01 hadoop-2.7.3-src]$ mvn clean package -Pdist,native -DskipTests -Dtar
The following output appears
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 6.132 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 2.488 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 8.366 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 3.462 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.387 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 7.532 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 6.074 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 11.212 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 11.060 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 5.193 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [02:45 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 9.000 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 17.196 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.069 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [02:52 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 40.517 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 12.716 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 5.162 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.169 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.131 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [01:59 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 39.737 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.199 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 20.401 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 25.076 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 4.874 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 14.696 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 25.992 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 8.638 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 12.094 s]
[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [ 6.517 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.167 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 4.141 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 2.289 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.080 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [ 8.001 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 4.792 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.293 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 29.094 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 23.380 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 6.339 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 16.128 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 9.231 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 6.078 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 2.791 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 9.425 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 3.541 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 6.571 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 11.545 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 2.658 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 8.210 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 6.718 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 4.130 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [ 2.701 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 4.133 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 8.266 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 7.647 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 6.851 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 6.629 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 8.927 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 1.318 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 8.397 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 11.662 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.066 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 48.961 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17:30 min
[INFO] Finished at: 2016-11-18T19:35:48+08:00
[INFO] Final Memory: 115M/485M
[INFO] ------------------------------------------------------------------------
This output indicates the build succeeded.
To verify, go to /hadoop/app/hadoop-2.7.3-src/hadoop-dist/target/hadoop-2.7.3/lib/native
and inspect the libraries with the file * command.
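To make the file check mechanical, a small helper function (hypothetical, not from the guide) can flag any library that is not a 64-bit ELF object:

```shell
# check_native_64bit DIR: fail if any shared library in DIR is not 64-bit ELF.
# Hypothetical helper; run it against lib/native after the build finishes.
check_native_64bit() {
  dir="$1"
  for f in "$dir"/*.so*; do
    # With no matches the glob stays literal, so guard with -e.
    [ -e "$f" ] || { echo "no libraries found in $dir"; return 1; }
    if ! file -L "$f" | grep -q 'ELF 64-bit'; then
      echo "NOT 64-bit: $f"
      return 1
    fi
  done
  echo "all native libraries in $dir are 64-bit"
}
```

For example: check_native_64bit /hadoop/app/hadoop-2.7.3-src/hadoop-dist/target/hadoop-2.7.3/lib/native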