Hadoop 2.4.1 Deployment, Part 1: Building and Installing from Source

A detailed walkthrough of compiling and installing Hadoop 2.4.1 on CentOS (adapted from guides found online).


1. Install the JDK (details omitted here; look them up online if needed)
   
a. Create a file named java.sh under /etc/profile.d/


with the following contents:


#set java environment
JAVA_HOME=/usr/java/jdk1.7.0_60
CLASSPATH=.:$JAVA_HOME/lib/tools.jar
PATH=$JAVA_HOME/bin:$PATH
export JAVA_HOME CLASSPATH PATH


b. source /etc/profile 


c. Verify that the JDK is installed correctly: java -version
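
A quick way to confirm the variables took effect in the current shell (the paths assume the jdk1.7.0_60 install location used above):

echo $JAVA_HOME     # should print /usr/java/jdk1.7.0_60
which java          # should resolve to a path under $JAVA_HOME/bin
java -version       # should report 1.7.0_60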






2. Pre-build preparation (Maven)






a. From the official Maven download site you could build from source, but here we simply download a prebuilt binary:


cd /opt
wget http://mirror.bit.edu.cn/apache/maven/maven-3/3.2.2/binaries/apache-maven-3.2.2-bin.tar.gz
tar -xvf apache-maven-3.2.2-bin.tar.gz
mv apache-maven-3.2.2 maven3.2.2




b. Create a file named maven.sh under /etc/profile.d/ with the following contents:


export MAVEN_HOME=/opt/maven3.2.2
export PATH=$PATH:$MAVEN_HOME/bin  


c. source /etc/profile 




d. Verify that Maven is installed correctly: mvn -version




3. Build Hadoop


a. Download the Hadoop source from an official mirror:


wget http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-2.4.1/hadoop-2.4.1-src.tar.gz 
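
The rest of the walkthrough works out of /opt/hadoop-2.4.1-src, so unpack the archive and copy it there first (a gap-filling step; it assumes the tarball sits in the current directory, matching the shell history at the end of this article):

tar -xvf hadoop-2.4.1-src.tar.gz
cp -r hadoop-2.4.1-src /opt/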




b. Since the overseas Maven repositories may be unreachable, first configure a domestic (Chinese) mirror for Maven. In the Maven directory, edit conf/settings.xml and add the following inside <mirrors></mirrors>, leaving the existing entries untouched:




<mirror>  
  <id>nexus-osc</id>  
  <mirrorOf>*</mirrorOf>  
  <name>Nexusosc</name>  
  <url>http://maven.oschina.net/content/groups/public/</url>  
</mirror> 
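
To confirm Maven actually picked up the mirror, you can dump the effective settings; help:effective-settings is a standard maven-help-plugin goal, and this check is optional:

mvn help:effective-settings | grep -A 3 '<mirror>'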






c. Add the following new profile inside <profiles></profiles>:


<profile>  
       <id>jdk-1.7</id>  
       <activation>  
         <jdk>1.7</jdk>  
       </activation>  
       <repositories>  
         <repository>  
           <id>nexus</id>  
           <name>local private nexus</name>  
           <url>http://maven.oschina.net/content/groups/public/</url>  
           <releases>  
             <enabled>true</enabled>  
           </releases>  
           <snapshots>  
             <enabled>false</enabled>  
           </snapshots>  
         </repository>  
       </repositories>  
       <pluginRepositories>  
         <pluginRepository>  
           <id>nexus</id>  
           <name>local private nexus</name>  
           <url>http://maven.oschina.net/content/groups/public/</url>  
           <releases>  
             <enabled>true</enabled>  
           </releases>  
           <snapshots>  
             <enabled>false</enabled>  
           </snapshots>  
         </pluginRepository>  
       </pluginRepositories>  
</profile>  
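
Similarly, the standard help:active-profiles goal can confirm that the jdk-1.7 profile activates when Maven runs under a 1.7 JDK (an optional check, run from inside any Maven project such as the Hadoop source tree):

mvn help:active-profiles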




d. Run the initial build:


cd /opt/hadoop-2.4.1-src
mvn clean install  






The build fails with the following error:


[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.2.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]  
[ERROR]   
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.  
[ERROR] Re-run Maven using the -X switch to enable full debug logging.  
[ERROR]   
[ERROR] For more information about the errors and possible solutions, please read the following articles:  
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException  
[ERROR]   
[ERROR] After correcting the problems, you can resume the build with the command  
[ERROR]   mvn <goals> -rf :hadoop-common  


e. Building Hadoop 2.4.1 requires protoc 2.5.0, so protoc must be installed as well.


Download it from https://code.google.com/p/protobuf/downloads/list (be sure to pick version 2.5.0).
Before building protoc, install the gcc, gcc-c++, and make build dependencies (skip any that are already installed):


yum install gcc  
yum install gcc-c++  
yum install make  




f. Build and install protoc:
 
tar -xvf protobuf-2.5.0.tar.bz2  
cd protobuf-2.5.0  
./configure --prefix=/opt/protoc/  
make && make install  




Create a file named protoc.sh under /etc/profile.d/ with the following contents, and then run source /etc/profile:


export PROTOC_HOME=/opt/protoc/   
export PATH=$PATH:$PROTOC_HOME/bin
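
If protoc later refuses to start with an error about missing libprotobuf/libprotoc shared libraries (possible with a non-standard prefix such as /opt/protoc), also add the lib directory to the loader path in the same protoc.sh; treat this as a precaution rather than a required step:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PROTOC_HOME/lib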


Verify the installation: protoc --version


g. Re-run the build:


cd /opt/hadoop-2.4.1-src  
mvn clean install  
  






The build fails again with: [ERROR] class file for org.mortbay.component.AbstractLifeCycle not found


This time it is a known bug. Following the official notes at https://issues.apache.org/jira/browse/HADOOP-10110, add the following dependency to the hadoop-common-project/hadoop-auth/pom.xml file and re-run the build:


<dependency>  
   <groupId>org.mortbay.jetty</groupId>  
   <artifactId>jetty-util</artifactId>  
   <scope>test</scope>  
</dependency>  






h. If you encounter the following error:


[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test (default-test) on project hadoop-common: There are test failures.
[ERROR] 
[ERROR] Please refer to /opt/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common








There are two ways to skip the failing tests. The first is on the command line:
mvn clean package -Dmaven.test.skip=true

(-Dmaven.test.skip=true skips both compiling and running the tests, whereas -DskipTests compiles them but does not run them.)

The second is to configure it in the pom file (this is the approach I used):
<plugin>  
        <groupId>org.apache.maven.plugins</groupId>  
        <artifactId>maven-surefire-plugin</artifactId>  
        <version>2.4.2</version>  
        <configuration>  
          <skipTests>true</skipTests>  
        </configuration>  
</plugin>  














i. If you encounter the following error:


Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common:


this is caused by zlib1g-dev not being installed. On Debian/Ubuntu it could be installed with apt-get; for CentOS, see the note near the end of this article.




Also, hold off on the final build for a moment, or you will run into yet more errors: the cmake, openssl-devel, and ncurses-devel dependencies need to be installed first.
yum install cmake  
yum install openssl-devel  
yum install ncurses-devel  


j. Now the build proper can be run (-Pdist,native builds the distribution layout together with the native libraries, -DskipTests skips the tests, and -Dtar additionally produces a .tar.gz package):


mvn package -Pdist,native -DskipTests -Dtar  


(This takes quite a while, roughly 10-20 minutes.)


[INFO] ------------------------------------------------------------------------  
[INFO] Reactor Summary:  
[INFO]   
[INFO] Apache Hadoop Main ................................ SUCCESS [3.709s]  
[INFO] Apache Hadoop Project POM ......................... SUCCESS [2.229s]  
[INFO] Apache Hadoop Annotations ......................... SUCCESS [5.270s]  
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.388s]  
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [3.485s]  
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [8.655s]  
[INFO] Apache Hadoop Auth ................................ SUCCESS [7.782s]  
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [5.731s]  
[INFO] Apache Hadoop Common .............................. SUCCESS [1:52.476s]  
[INFO] Apache Hadoop NFS ................................. SUCCESS [9.935s]  
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.110s]  
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:58.347s]  
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [26.915s]  
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [17.002s]  
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [5.292s]  
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.073s]  
[INFO] hadoop-yarn ....................................... SUCCESS [0.335s]  
[INFO] hadoop-yarn-api ................................... SUCCESS [54.478s]  
[INFO] hadoop-yarn-common ................................ SUCCESS [39.215s]  
[INFO] hadoop-yarn-server ................................ SUCCESS [0.241s]  
[INFO] hadoop-yarn-server-common ......................... SUCCESS [15.601s]  
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [21.566s]  
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [4.754s]  
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [20.625s]  
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.755s]  
[INFO] hadoop-yarn-client ................................ SUCCESS [6.748s]  
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.155s]  
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [4.661s]  
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.160s]  
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [36.090s]  
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [2.753s]  
[INFO] hadoop-yarn-site .................................. SUCCESS [0.151s]  
[INFO] hadoop-yarn-project ............................... SUCCESS [4.771s]  
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [24.870s]  
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [3.812s]  
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [15.759s]  
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [6.831s]  
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [8.126s]  
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [2.320s]  
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [9.596s]  
[INFO] hadoop-mapreduce .................................. SUCCESS [3.905s]  
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [7.118s]  
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [11.651s]  
[INFO] Apache Hadoop Archives ............................ SUCCESS [2.671s]  
[INFO] Apache Hadoop Rumen ............................... SUCCESS [10.038s]  
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [6.062s]  
[INFO] Apache Hadoop Data Join ........................... SUCCESS [4.104s]  
[INFO] Apache Hadoop Extras .............................. SUCCESS [4.210s]  
[INFO] Apache Hadoop Pipes ............................... SUCCESS [9.419s]  
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [2.306s]  
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.037s]  
[INFO] Apache Hadoop Distribution ........................ SUCCESS [21.579s]  
[INFO] Apache Hadoop Client .............................. SUCCESS [7.299s]  
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [7.347s]  
[INFO] ------------------------------------------------------------------------  
[INFO] BUILD SUCCESS  


The compiled distribution ends up under: hadoop-2.4.1-src/hadoop-dist/target/
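
To sanity-check the result (the paths below assume the default dist layout; adjust them if yours differs):

cd hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1
file lib/native/libhadoop.so.1.0.0   # should report an ELF binary for your architecture
bin/hadoop version                   # prints the version banner of the fresh build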




For reference, here is the verbatim shell history from the session (typos and all):


  810  rpm -ivh jdk-7u60-linux-i586.rpm 
  811  cd /usr/
  812  ks
  813  ls
  814  cd java/
  815  lks
  816  ks
  817  ls
  818  cd jdk1.7.0_60/
  819  ls
  820  cd bin/
  821  java
  822  kk
  823  ll
  824  java
  825  java -v
  826  owd
  827  pwd
  828  cd /etc/profile.d/
  829  vi java.sh
  830  source /etc/profile
  831  java -version
  832  cd 
  833  ls
  834  cd soft/
  835  ls
  836  ll
  837  tar -xvf apache-maven-3.2.2-bin.tar.gz 
  838  ll
  839  mv apache-maven-3.2.2 maven3.2.2
  840  ll
  841  mv -r maven3.2.2 /opt/
  842  mv -R maven3.2.2 /opt/
  843  cp -r maven3.2.2 /opt/
  844  cd /opt/
  845  ls
  846  cd maven3.2.2/
  847  ls
  848  pwd
  849  cd /etc/profile.d/
  850  mkdir maven.sh
  851  vi maven.sh
  852  ll
  853  rm -rf maven.sh/
  854  vi maven.sh
  855  /etc/profile
  856  cd ..
  857  cd 
  858  /etc/profile
  859  source /etc/profile
  860  mvn -version
  861  cd 
  862  ls
  863  cd soft/
  864  ls
  865  ll
  866  rm -rf maven3.2.2
  867  ll
  868  ll
  869  tar -xvf hadoop-2.4.1-src.tar.gz 
  870  ll
  871  ll
  872  ll
  873  df -h
  874  cp -r hadoop-2.4.1-src /opt/
  875  cd /opt/
  876  ls
  877  cd hadoop-2.4.1-src/
  878  ls
  879  ll
  880  成都。。
  881  cd ..
  882  ls
  883  cd maven3.2.2/
  884  ls
  885  cd conf/
  886  ls
  887  more settings.xml 
  888             ll
  889  vi settings.xml 
  890  vi settings.xml 
  891  cd ..
  892  ls
  893  cd ..
  894  ls
  895  cd hadoop-2.4.1-src/
  896  ls
  897  ll
  898  mvn clean install 
  899  more pom.xml 
  900            
  901  cd ..
  902  ls
  903  ll
  904  rm -rf maven3.2.2
  905  cp ~/soft/apache-maven-3.2.2-bin.tar.gz 
  906  cp ~/soft/apache-maven-3.2.2-bin.tar.gz ./
  907  tar -xvf apache-maven-3.2.2-bin.tar.gz 
  908  ll
  909  mv apache-maven-3.2.2 maven3.2.2
  910  ll
  911  rm apache-maven-3.2.2-bin.tar.gz 
  912  ll
  913  maven -version
  914  maven -version
  915  set
  916  source /etc/profile
  917  ll
  918  maven -version
  919  java -version
  920  mvn -version
  921  cd ..
  922  cd /opt/
  923  ls
  924  cd hadoop-2.4.1-src/
  925  ll
  926  mvn clean install
  927  cat /etc/resolv.conf
  928  vi /etc/resolv.conf
  929  ifdown
  930  ifdown eth0
  931  ping baidu.com
  932  ping www.baidu.com
  933  ifconfig
  934  vi /etc/resolv.conf
  935  ping 8.8.8.8
  936  ping 114.114.114.114
  937  vi /etc/resolv.conf
  938  mvn clean install
  939  yum install gcc 
  940  yum intall gcc-c++  
  941   rpm -qa|grep gcc
  942  rz -e
  943  cd 
  944  cd soft/
  945  rz -e
  946  ll
  947  tzr protobuf-2.5.0.tar.gz 
  948  tar -zvf protobuf-2.5.0.tar.gz 
  949  tar -xvf protobuf-2.5.0.tar.gz 
  950  ll
  951  cp protobuf-2.5.0 /opt/
  952  cp -r protobuf-2.5.0 /opt/
  953  ll
  954  rm -rf protobuf-2.5.0
  955  ls
  956  cd /opt/
  957  ls
  958  mv protobuf-2.5.0 protobuf2.5.0
  959  ll
  960  cd protobuf2.5.0/
  961  ll
  962  ./configure --prefix=/opt/protoc/  
  963  make && make install
  964  ll
  965  cd ..
  966  ls
  967  ll
  968  vi /etc/profile.d/protoc.sh
  969  source /etc/profile
  970  protoc --version
  971  more /etc/yum.repos.d/rhel-debuginfo.repo 
  972  rpm -qa|grep yum
  973  yum install
  974  yum install cmake
  975  cd /etc/yum.repos.d
  976  ll
  977  ll
  978  mv rhel-debuginfo.repo rhel-debuginfo.repo.bak
  979  ll
  980  wget http://docs.linuxtone.org/soft/lemp/CentOS-Base.repo
  981  ll
  982  mv CentOS-Base.repo rhel-debuginfo.repo
  983  yum install cmake
  984  yum install openssl-devel 
  985  yum install ncurses-devel
  986  yum install gcc
  987  yum intall gcc-c++ 
  988  yum install make
  989  ll
  990  rpm -qa|grep yum
  991  rpm -qa|grep gcc
  992  cd /opt/hadoop-2.4.1-src/
  993  ll
  994  mvn clean install 
  995  ps -ef |grep java
  996  ps -ef |grep mvn
  997  ps -ef |grep maven
  998  protoc --version
  999  ll
 1000  vi pom.xml 
 1001  mvn package -Pdist,native -DskipTests -Dtar 
 1002  history 

 1003  history >setupcmd.txt





Note: when installing zlib1g-dev, I could never find an rpm package.

In that case, download the deb package from http://packages.ubuntu.com/lucid/i386/zlib1g-dev/download, download alien from https://packages.debian.org/unstable/source/alien,


and then convert and install it:

cd alien
./alien.pl -r ../zlib1g-dev_1.2.3.3.dfsg-15ubuntu1_i386.deb
rpm -ivh zlib1g-dev_1.2.3.3.dfsg-15ubuntu1_i386.rpm
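
If you want to see what the converted package will install before running rpm -ivh, rpm -qpl lists the files contained in a package file:

rpm -qpl zlib1g-dev_1.2.3.3.dfsg-15ubuntu1_i386.rpm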


