Compiling Hadoop

Hadoop Deploy

Download the Required Packages

Set Up the Build Environment

Install the JDK

Extract the archive and configure the environment variables, and it is ready to use:

export JAVA_HOME=/opt/jdk1.8.0_131
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$JAVA_HOME/bin:$PATH

# Since JDK 1.7, CLASSPATH no longer needs to be configured explicitly
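A quick sanity check of why the PATH line above works: prepending a directory makes its executables shadow same-named ones later on the path, which is why $JAVA_HOME/bin is put first. The demo below uses a hypothetical stub executable in a temp directory (not part of the install):

```shell
# Create a stub "java" in a temp dir and prepend that dir to PATH.
tmpdir=$(mktemp -d)
printf '#!/bin/sh\necho stub-java\n' > "$tmpdir/java"
chmod +x "$tmpdir/java"
PATH="$tmpdir:$PATH"
out=$(java)        # resolves to the stub: the first PATH entry wins
echo "$out"
rm -rf "$tmpdir"
```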

Install System Dependencies

yum -y install autoconf automake libtool cmake ncurses-devel openssl-devel gcc gcc-c++

Install Maven

tar -zxvf apache-maven-3.3.9-bin.tar.gz -C /opt/
vi /etc/profile
export MAVEN_HOME=/opt/apache-maven-3.3.9
export PATH=$PATH:$MAVEN_HOME/bin
source /etc/profile

[root@localhost lib]# mvn -version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /usr/local/apache-maven-3.3.9
Java version: 1.7.0_79, vendor: Oracle Corporation
Java home: /usr/local/jdk/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"

Pointing Maven at the Aliyun mirror greatly speeds up artifact downloads, since the Central repository is hosted overseas. Add the following to the <mirrors> section of Maven's settings.xml ($MAVEN_HOME/conf/settings.xml, or ~/.m2/settings.xml):
<mirror>
    <id>nexus-aliyun</id>
    <mirrorOf>*</mirrorOf>
    <name>Nexusaliyun</name>
    <url>http://maven.aliyun.com/nexus/content/groups/public</url>
</mirror>
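For context, the <mirror> element must sit inside a <mirrors> section; a minimal complete settings.xml (structure only, illustrative) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <mirrors>
    <mirror>
      <id>nexus-aliyun</id>
      <mirrorOf>*</mirrorOf>
      <name>Nexus aliyun</name>
      <url>http://maven.aliyun.com/nexus/content/groups/public</url>
    </mirror>
  </mirrors>
</settings>
```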

Install Ant

tar -zxvf apache-ant-1.10.1-bin.tar.gz -C /opt/

vi /etc/profile
export ANT_HOME=/opt/apache-ant-1.10.1
export PATH=$PATH:$ANT_HOME/bin
source /etc/profile

[root@localhost ~]# ant -version
Apache Ant(TM) version 1.10.1 compiled on February 2 2017

Install protobuf-2.5.0 (Hadoop 2.x requires exactly this version)

tar -zxvf protobuf-2.5.0.tar.gz -C /opt/
cd /opt/protobuf-2.5.0
./configure

make
make check
make install    # usually needs root privileges
ldconfig        # refresh the shared-library cache so protoc can load libprotoc

protoc --version
# should print: libprotoc 2.5.0

Configure Maven Memory

To avoid java.lang.OutOfMemoryError: Java heap space during the build, increase Maven's heap before compiling (on CentOS):
export MAVEN_OPTS="-Xms256m -Xmx512m"
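MAVEN_OPTS works because exported shell variables are inherited by child processes, and mvn is launched as a child of the current shell. A small illustration (values only, no Maven involved):

```shell
# Export the JVM options, then read them back from a child shell,
# the same way the mvn launcher script would see them.
export MAVEN_OPTS="-Xms256m -Xmx512m"
child=$(sh -c 'printf %s "$MAVEN_OPTS"')
echo "$child"
```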

Compile Hadoop

Extract the source package and run the packaging command. -Pdist,native builds the full distribution including the native libraries, -DskipTests skips the unit tests, and -Dtar also produces a .tar.gz package:

tar -zxvf hadoop-2.7.5-src.tar.gz -C /opt/
cd /opt/hadoop-2.7.5-src/
mvn package -Pdist,native -DskipTests -Dtar

......
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 29:36 min
[INFO] Finished at: 2018-01-19T02:04:55+08:00
[INFO] Final Memory: 101M/237M
[INFO] ------------------------------------------------------------------------

The build generates a target directory under /opt/hadoop-2.7.5-src/hadoop-dist; the compiled hadoop-2.7.5.tar.gz package is located in that directory.

Verify the Build

cd /opt/hadoop-2.7.5-src/hadoop-dist/target/hadoop-2.7.5/lib/native

[root@cyyun native]# file *
libhadoop.a:        current ar archive
libhadooppipes.a:   current ar archive
libhadoop.so:       symbolic link to `libhadoop.so.1.0.0'
libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
libhadooputils.a:   current ar archive
libhdfs.a:          current ar archive
libhdfs.so:         symbolic link to `libhdfs.so.0.0.0'
libhdfs.so.0.0.0:   ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped

Check the attributes of libhadoop.so.1.0.0 in this directory: "ELF 64-bit LSB" indicates the library was successfully compiled as 64-bit.

If a Hadoop release without natively compiled libraries is already installed, simply copy all files from the compiled /opt/hadoop-2.7.5-src/hadoop-dist/target/hadoop-2.7.5/lib/native directory into the corresponding directory of that Hadoop installation.
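The copy step can be sketched as follows. The paths here are simulated with temp directories so the sketch is self-contained; on a real system the source would be the build's lib/native directory shown above and the destination the installed Hadoop's lib/native:

```shell
# Simulated directories standing in for the real build output and install tree.
BUILD_NATIVE=$(mktemp -d)     # stands in for .../hadoop-dist/target/hadoop-2.7.5/lib/native
INSTALL_NATIVE=$(mktemp -d)   # stands in for $HADOOP_HOME/lib/native
touch "$BUILD_NATIVE/libhadoop.so.1.0.0" "$BUILD_NATIVE/libhdfs.so.0.0.0"
# -a preserves symlinks, permissions and timestamps; the trailing /. copies the contents.
cp -a "$BUILD_NATIVE"/. "$INSTALL_NATIVE"/
ls "$INSTALL_NATIVE"
```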

Reposted from: https://my.oschina.net/epoch/blog/1611191
