Building 64-bit Hadoop 2.7.1 on CentOS 7

The official site only ships 32-bit x86 Hadoop binaries; for 64-bit you have to build it yourself. The build was a bumpy ride: with no time during the day, I spent two or three evenings on it and hit all sorts of problems along the way. I am writing down the issues I ran into for other readers' reference. Corrections are welcome.

Here is my finished build of hadoop 2.7.1, downloadable from Baidu Netdisk:
http://pan.baidu.com/s/1pJkmqoj

I. Environment

CentOS 7, 64-bit, with network access.

II. Install the build dependencies

yum -y install svn ncurses-devel gcc*
yum -y install lzo-devel zlib-devel autoconf automake libtool cmake openssl-devel

Installing Asterisk
Prerequisites for installing Asterisk:

1. Check whether kernel-devel is already installed

rpm -q kernel-devel

Output such as kernel-devel-2.6.18-92.1.10.el5 means it is already installed; if not, install it with yum:

yum install kernel-devel

2. Check whether the following helper packages are installed

rpm -q bison bison-devel ncurses ncurses-devel zlib zlib-devel openssl openssl-devel gnutls-devel gcc gcc-c++ mysql-devel

At a minimum, mysql-devel must be installed.

3. Install any missing packages with yum

yum install bison
yum install bison-devel
yum install ncurses
yum install ncurses-devel
yum install zlib
yum install zlib-devel
yum install openssl
yum install openssl-devel
yum install gnutls-devel
yum install gcc
yum install gcc-c++
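The per-package checks in steps 2 and 3 can be collapsed into one loop that prints a yum command for every missing package. A minimal sketch; the `installed` list is hard-coded here as a stand-in for real `rpm -q` output (real use would query rpm instead):

```shell
# Packages this guide asks for
required="bison bison-devel ncurses ncurses-devel zlib zlib-devel openssl openssl-devel gnutls-devel gcc gcc-c++ mysql-devel"
# Hypothetical snapshot of what rpm reports as installed
installed="bison ncurses ncurses-devel zlib openssl gcc"
for pkg in $required; do
    case " $installed " in
        *" $pkg "*) ;;                    # already present, nothing to do
        *) echo "yum install $pkg" ;;     # print the command to run
    esac
done
```

With the stand-in list above, the loop prints one `yum install` line for each absent package (bison-devel, zlib-devel, and so on).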

4. Install the remaining dependencies

sudo yum install gcc gcc-c++ make wget subversion libxml2-devel ncurses-devel openssl-devel vim-enhanced

Download the Asterisk source code

First create a directory to hold it:

$ mkdir -p ~/src/asterisk-complete/asterisk

There are two ways to fetch the source.

Svn:

$ cd ~/src/asterisk-complete/asterisk
$ svn co http://svn.asterisk.org/svn/asterisk/branches/1.8

Wget:

$ cd ~/src/asterisk-complete/asterisk

$ wget http://downloads.asterisk.org/pub/telephony/asterisk/asterisk-1.8-current.tar.gz

$ tar zxvf asterisk-1.8-current.tar.gz

I used the SVN method.

Install Asterisk

1. Install libpri

$ cd ~/src/asterisk-complete/
$ mkdir libpri
$ cd libpri/
$ svn co http://svn.asterisk.org/svn/libpri/tags/1.4.11.5 (substitute the latest tag on the SVN server; see http://downloads.asterisk.org/pub/telephony/libpri/)
$ cd 1.4.11.5
$ make
$ sudo make install

2. Install DAHDI

$ cd ~/src/asterisk-complete/
$ mkdir dahdi
$ cd dahdi/
$ svn co http://svn.asterisk.org/svn/dahdi/linux-complete/tags/2.4.1.2+2.4.1 (substitute the latest tag on the SVN server; see http://downloads.asterisk.org/pub/telephony/dahdi-linux-complete/)
$ cd 2.4.1.2+2.4.1
$ make
$ sudo make install
$ sudo make config

3. Install Asterisk

$ cd ~/src/asterisk-complete/asterisk/1.8
$ ./configure
$ make
$ sudo make install
$ sudo make config

4. Build the documentation

make progdocs

5. Install the sample configuration files

make samples

6. Disable SELinux

$ cd /etc/selinux/
$ sudo vim config      (set SELINUX=disabled)
$ sudo reboot
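Disabling SELinux here boils down to one line in /etc/selinux/config: change SELINUX=enforcing to SELINUX=disabled, then reboot. The same edit can be done non-interactively with sed; the sketch below works on a throwaway copy of the file so it can be tried safely (the path and contents are a stand-in for the real config):

```shell
# Throwaway stand-in for /etc/selinux/config
printf 'SELINUX=enforcing\nSELINUXTYPE=targeted\n' > /tmp/selinux-config
# The same edit you would otherwise make in vim
sed -i 's/^SELINUX=enforcing$/SELINUX=disabled/' /tmp/selinux-config
grep '^SELINUX=' /tmp/selinux-config
```

For the real file, point sed at /etc/selinux/config (with sudo) and reboot afterwards as above.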

III. Install the JDK

The latest version is 1.8, and that is what I installed at first, but the build later would not go through with it, so I switched to 1.7. I suggest installing 1.7 straight away; download it from the official site.

tar -xzvf jdk-7u80-linux-x64.gz

After extracting, move it under /usr and configure the environment variables:

vim /etc/profile  
Add the following lines:
export JAVA_HOME=/usr/java/jdk1.7/
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JAVA_HOME/jre/lib
export PATH=$PATH:$JAVA_HOME/bin:$JAVA_HOME/jre/bin

Apply the changes:

source /etc/profile 

Verify that the Java environment works:

java -version  
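Since the build only went through on JDK 1.7 for me, a setup script might as well check the major version before starting. A sketch; the `ver` string is hard-coded here as a stand-in for parsing the real `java -version` output:

```shell
# Stand-in for: ver=$(java -version 2>&1 | awk -F '"' '/version/ {print $2}')
ver="1.7.0_80"
# Pre-Java-9 version strings look like 1.<major>.<minor>_<update>
major=$(echo "$ver" | cut -d. -f2)
if [ "$major" -eq 7 ]; then
    echo "JDK 1.7 detected"
else
    echo "JDK 1.$major detected; this guide hit build failures on anything but 1.7"
fi
```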

IV. Install protobuf-2.5.0

Download protobuf-2.5.0.tar.gz from the web, then:

1. tar zxvf protobuf-2.5.0.tar.gz
2. cd protobuf-2.5.0
3. ./configure
4. make
5. make install

Verify the installation:

protoc --version 
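The Hadoop build insists on protobuf 2.5.0 exactly (see errors 1 and 4 below), so it pays to compare the reported version against that string. A sketch, with the `protoc --version` output hard-coded as a stand-in:

```shell
# Stand-in for: have=$(protoc --version)   # e.g. "libprotoc 2.5.0"
have="libprotoc 2.5.0"
want="2.5.0"
# Strip the "libprotoc " prefix and compare the bare version
if [ "${have#libprotoc }" = "$want" ]; then
    echo "protoc $want found"
else
    echo "protoc mismatch: got '$have', hadoop-common expects $want"
fi
```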

V. Install Maven

Download apache-maven-3.2.3-bin.tar.gz and extract it:

tar zxvf apache-maven-3.2.3-bin.tar.gz

Configure the environment variables in /etc/profile:

export MAVEN_HOME=/usr/local/program/maven/apache-maven-3.2.3
export PATH=$PATH:$MAVEN_HOME/bin

Apply the changes:

source /etc/profile  

Verify the installation:

mvn -version 

VI. Install Ant

Download apache-ant-1.9.4-bin.tar.gz, extract it, and add the environment variables to /etc/profile:

1. vi /etc/profile
export ANT_HOME=/home/joywang/apache-ant-1.9.4
export PATH=$PATH:$ANT_HOME/bin
2. source /etc/profile
3. ant -version

VII. Build Hadoop

Everything above was preparation; the real work starts here.
Download hadoop-2.7.1-src.tar.gz from the official site.

tar -xzvf hadoop-2.7.1-src.tar.gz
cd hadoop-2.7.1-src
mvn clean package -Pdist,native -DskipTests -Dtar

Be patient: Maven downloads a lot of dependencies before it finally reports a successful build.

The finished tarball is placed at hadoop-dist/target/hadoop-2.7.1.tar.gz.
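Since the whole point of this exercise is 64-bit native code, it is worth confirming the result: after unpacking the tarball, run `file` on lib/native/libhadoop.so.1.0.0 and look for "ELF 64-bit". A sketch of that check, with the `file` output hard-coded as a stand-in for a successful 64-bit build:

```shell
# Stand-in for: out=$(file hadoop-2.7.1/lib/native/libhadoop.so.1.0.0)
out="libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV)"
case "$out" in
    *"ELF 64-bit"*) echo "native library is 64-bit" ;;
    *)              echo "native library is NOT 64-bit" ;;
esac
```

If `file` instead reports "ELF 32-bit", the native build silently fell back to the bundled 32-bit libraries and the compile should be re-checked.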

VIII. Build errors and fixes

Some of these errors I ran into myself; the rest I collected from material found online, and I am listing them all together.

Error 1:

[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.2.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: protoc version is 'libprotoc 2.4.1', expected version is '2.5.0' -> [Help 1]  
[ERROR]  
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.  
[ERROR] Re-run Maven using the -X switch to enable full debug logging.  
[ERROR]  
[ERROR] For more information about the errors and possible solutions, please read the following articles:  
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException  
[ERROR]  
[ERROR] After correcting the problems, you can resume the build with the command  
[ERROR] mvn <goals> -rf :hadoop-common 
Install protoc:

wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz

(download from https://code.google.com/p/protobuf/downloads/list)

Extract it, cd into the source root, and run:

sudo ./configure --prefix=/usr

If configure fails with:

cpp: error trying to exec 'cc1plus': execvp: No such file or directory

then install g++ first (that is Ubuntu syntax; on CentOS: yum install gcc-c++):

sudo apt-get install g++

sudo make

sudo make check

sudo make install

protoc --version

Error 2:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/home/wyf/hadoop-2.0.2-alpha-src/hadoop-common-project/hadoop-common/target/native"): java.io.IOException: error=2, No such file or directory -> [Help 1]  
[ERROR]  
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.  
[ERROR] Re-run Maven using the -X switch to enable full debug logging.  
[ERROR]  
[ERROR] For more information about the errors and possible solutions, please read the following articles:  
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException 

Install CMake (Ubuntu syntax; on CentOS: yum install cmake):

sudo apt-get install cmake

Error 3:

[ERROR] Failed to execute goal org.codehaus.mojo.jspc:jspc-maven-plugin:2.0-alpha-3:compile (hdfs) on project hadoop-hdfs: Execution hdfs of goal org.codehaus.mojo.jspc:jspc-maven-plugin:2.0-alpha-3:compile failed: Plugin org.codehaus.mojo.jspc:jspc-maven-plugin:2.0-alpha-3 or one of its dependencies could not be resolved: Could not transfer artifact ant:ant:jar:1.6.5 from/to central (http://repo.maven.apache.org/maven2): GET request of: ant/ant/1.6.5/ant-1.6.5.jar from central failed: Read timed out -> [Help 1]  
[ERROR]  
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.  
[ERROR] Re-run Maven using the -X switch to enable full debug logging.  
[ERROR]  
[ERROR] For more information about the errors and possible solutions, please read the following articles:  
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException  
[ERROR]  
[ERROR] After correcting the problems, you can resume the build with the command  
[ERROR]   mvn <goals> -rf :hadoop-hdfs 
Install Ant:

1. Download apache-ant-1.9.4-bin.tar.gz

2. Extract it:

tar zxvf apache-ant-1.9.4-bin.tar.gz

3. Configure the environment variables:

vim ~/.bashrc

export ANT_HOME=/home/xxl/apache-ant-1.9.4

export PATH=$ANT_HOME/bin:$PATH

source ~/.bashrc

Error 4:

[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.4.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]  
[ERROR]  
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.  
[ERROR] Re-run Maven using the -X switch to enable full debug logging.  
[ERROR]  
[ERROR] For more information about the errors and possible solutions, please read the following articles:  
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException  
[ERROR]  
[ERROR] After correcting the problems, you can resume the build with the command  
[ERROR]   mvn <goals> -rf :hadoop-common 

The protobuf version is too old; install version 2.5.0.

Error 5:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: exec returned: 1 -> [Help 1]  
[ERROR]  
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.  
[ERROR] Re-run Maven using the -X switch to enable full debug logging.  

Install zlib-devel. The Ubuntu equivalent is:

sudo apt-get install zlib1g-dev

Error 6:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1  
[ERROR] around Ant part ...<exec dir="/home/xxl/hadoop-2.5.2-src/hadoop-tools/hadoop-pipes/target/native" executable="cmake" failonerror="true">... @ 5:120 in /home/xxl/hadoop-2.5.2-src/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml  
[ERROR] -> [Help 1] 

Install the OpenSSL development headers (Ubuntu syntax; on CentOS: yum install openssl-devel):
sudo apt-get install libssl-dev

Error 7:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (tar) on project hadoop-dist: An Ant BuildException has occured: exec returned: 1  
[ERROR] around Ant part ...<exec dir="/home/xxl/hadoop-2.5.2-src/hadoop-dist/target" executable="sh" failonerror="true">... @ 21:96 in /home/xxl/hadoop-2.5.2-src/hadoop-dist/target/antrun/build-main.xml 

Install (Ubuntu syntax; on CentOS, yum groupinstall "Development Tools" and yum install glib2-devel are the rough equivalents):

sudo apt-get install build-essential

sudo apt-get install libglib2.0-dev