The prebuilt hadoop-2.6.0 binaries are compiled for 64-bit platforms only. On a 32-bit system, startup and every hadoop command print a warning that the native library cannot be loaded. That warning was irritating enough that I decided to build hadoop-2.6.0 from source on a 32-bit virtual machine.
2. Preparation
0. Prerequisite: JDK 1.7 is installed and its environment variables are set. Java-ecosystem tools (JDK, Maven, Hadoop, Spark, etc.) all follow the same pattern:
(
export XX_HOME=/path/to/XX
export PATH=$XX_HOME/bin:$PATH
)
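For example, wiring up the JDK might look like the following (the install path is an assumption that happens to match the `mvn -v` output shown later; substitute your own):

```shell
# Append to /etc/profile or ~/.bashrc; the JDK path below is an example.
export JAVA_HOME=/usr/java/jdk1.7.0_79
export PATH=$JAVA_HOME/bin:$PATH
echo "JAVA_HOME is $JAVA_HOME"
```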
a. Install build dependencies
# yum install gcc gcc-c++ cmake openssl-devel ncurses-devel
(Depending on the distribution, zlib-devel may also be needed; Hadoop's BUILDING.txt lists it among the native-build requirements.)
b. Download protobuf-2.5.0.tar.gz and install it
tar -xzvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
make && make install
Verify the installation with protoc --version:
[feiy@server ~]$ protoc --version
libprotoc 2.5.0
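Hadoop 2.6.0's build requires protoc 2.5.0 exactly, so it is worth checking the version string before starting the long build. A minimal sketch, where the hard-coded `version_line` stands in for the real `protoc --version` call:

```shell
# In practice: version_line=$(protoc --version)
version_line="libprotoc 2.5.0"
required="2.5.0"
actual=${version_line#libprotoc }   # strip the "libprotoc " prefix
if [ "$actual" = "$required" ]; then
    echo "protoc version OK"
else
    echo "need protoc $required, found $actual" >&2
fi
```

If `protoc` itself fails with a missing libprotoc shared-library error after `make install`, running `ldconfig` as root usually fixes it, since protobuf installs into /usr/local/lib by default.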
c. Download Maven 3.3.9, unpack it, and set the environment variables
unzip apache-maven-3.3.9-bin.zip
export MAVEN_HOME=/path/to/maven
export PATH=$MAVEN_HOME/bin:$PATH
----------------------------------------------------------------
Reload the profile so the variables take effect: source /etc/profile (or reboot, a clumsier option)
Verify with mvn -v:
[feiy@server ~]$ mvn -v
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /opt/maven3
Java version: 1.7.0_79, vendor: Oracle Corporation
Java home: /usr/java/jdk1.7.0_79/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-573.el6.i686", arch: "i386", family: "unix"
d. Download hadoop-2.6.0-src.tar.gz and extract it
tar -xzvf hadoop-2.6.0-src.tar.gz
3. Build
mvn package -DskipTests -Pdist,native -Dtar
The build downloads a large number of POMs and jars: about half an hour when everything goes smoothly, two to three hours when it does not. I used Maven's default repositories rather than the http://maven.oschina.net mirror, which was unreachable at the time. The build also failed midway with "The system is out of resources"; raising the VM's memory to 1 GB let it complete.
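Before resorting to more VM memory, it can also help to give Maven's JVM a larger heap explicitly via MAVEN_OPTS. The sizes below are an illustrative guess for a 1 GB machine, not a value from the Hadoop docs:

```shell
# Cap Maven's heap below the VM's 1 GB of RAM; tune to taste.
export MAVEN_OPTS="-Xms256m -Xmx512m"
echo "MAVEN_OPTS set to: $MAVEN_OPTS"
```

Then rerun `mvn package -DskipTests -Pdist,native -Dtar`.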
When the build finishes, the distribution is under hadoop-dist/target:
[feiy@server target]$ pwd
/opt/hadoop-2.6.0-src/hadoop-dist/target
[feiy@server target]$ ls
antrun hadoop-2.6.0 hadoop-dist-2.6.0-javadoc.jar test-dir
dist-layout-stitching.sh hadoop-2.6.0.tar.gz javadoc-bundle-options
dist-tar-stitching.sh hadoop-dist-2.6.0.jar maven-archiver
Check the native libraries (under hadoop-2.6.0/lib/native):
[feiy@server native]$ ls
libhadoop.a libhadoop.so libhadooputils.a libhdfs.so
libhadooppipes.a libhadoop.so.1.0.0 libhdfs.a libhdfs.so.0.0.0
[feiy@server native]$ file libhadoop.so.1.0.0
libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), dynamically linked, not stripped
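The `file` output above confirms a 32-bit build. A small script can make that check automatic; the hard-coded `info` line stands in for a real `file` call, since the library's path depends on where you deploy the build:

```shell
# In practice: info=$(file /path/to/hadoop-2.6.0/lib/native/libhadoop.so.1.0.0)
info="libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386"
case "$info" in
    *"ELF 32-bit"*) result="native library is 32-bit" ;;
    *"ELF 64-bit"*) result="native library is 64-bit" ;;
    *)              result="unexpected file type" ;;
esac
echo "$result"
```

Once the build is deployed and `hadoop` is on the PATH, `hadoop checknative -a` reports which native components actually load, which is the definitive end-to-end check that the warning is gone.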