Building the Hadoop branch-0.20-append source

1、After installing hadoop-0.20.2 + zookeeper-3.4.6 + hbase-0.90.6, the web UI at http://10.10.10.70:60010 shows the following warning:
  You are currently running the HMaster without HDFS append support enabled. This may result in data loss. Please see the HBase wiki for details.
2、This happens because the stock hadoop-0.20.2 jar is incompatible with this HBase version: it lacks working HDFS append/sync support, which HBase's write-ahead log relies on, so data can be lost.
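A quick way to check an install for this condition is to look for the dfs.support.append property in hdfs-site.xml; the conf path below is an assumption for this cluster layout, and the property only takes effect on an append-capable build.

```shell
# Sketch: look for dfs.support.append in hdfs-site.xml (path is an
# assumption for this install; override HDFS_SITE if yours differs).
HDFS_SITE=${HDFS_SITE:-/usr/local/hadoop/conf/hdfs-site.xml}
if grep -q 'dfs.support.append' "$HDFS_SITE" 2>/dev/null; then
  echo "dfs.support.append is configured in $HDFS_SITE"
else
  echo "dfs.support.append not set; HMaster will keep warning about data loss"
fi
```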
3、The fix is to build the hadoop branch-0.20-append sources. Steps:
Install git
  grid@master1:/usr/local/hadoop$ sudo apt-get install -y git
Clone hadoop-common
  grid@master1:/usr/local/hadoop$ git clone http://git.apache.org/hadoop-common.git
  Cloning into 'hadoop-common'...
  Checking connectivity... done.
  grid@master1:/usr/local/hadoop$ cd hadoop-common/
Check out the append branch and verify it
  grid@master1:/usr/local/hadoop/hadoop-common$ git checkout -t origin/branch-0.20-append
  Branch branch-0.20-append set up to track remote branch branch-0.20-append from origin.
  Switched to a new branch 'branch-0.20-append'

  grid@master1:/usr/local/hadoop/hadoop-common$ git show-branch release-0.20.2 branch-0.20-append
  ! [release-0.20.2] Hadoop 0.20.2 release
   * [branch-0.20-append] HDFS-1779. After NameNode restart , Clients can not read partial files even after client invokes Sync. Contributed by Uma Maheswara Rao G.
  --
   * [branch-0.20-append] HDFS-1779. After NameNode restart , Clients can not read partial files even after client invokes Sync. Contributed by Uma Maheswara Rao G.
  ...
Create build.properties (one directory above the source tree)
  grid@master1:/usr/local/hadoop/hadoop-common$ vi ../build.properties

  resolvers=internal
  version=0.20-append-for-hbase
  project.version=${version}
  hadoop.version=${version}
  hadoop-core.version=${version}
  hadoop-hdfs.version=${version}
  hadoop-mapred.version=${version}

Symlink it into the source tree so ant picks it up
  grid@master1:/usr/local/hadoop/hadoop-common$ ln -s ../build.properties build.properties
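Before building, it is worth confirming that ant will actually see these overrides; a small sanity check, assuming the paths used in the steps above:

```shell
# Sanity check (paths from the steps above; adjust if your layout differs).
cd /usr/local/hadoop/hadoop-common 2>/dev/null || { echo "source tree not found"; exit 0; }
ls -l build.properties               # should show: build.properties -> ../build.properties
grep '^resolvers=' build.properties  # should print: resolvers=internal
```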

Install ant
  grid@master1:/usr/local/hadoop/hadoop-common$ sudo apt-get install -y ant
Build
  grid@master1:/usr/local/hadoop/hadoop-common$ ant mvn-install
  ...

  BUILD SUCCESSFUL
  Total time: 2 minutes 18 seconds
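If the build succeeded, the rebuilt core jar should now sit under build/ with the version string set in build.properties. A small check (the path below is assumed from those settings):

```shell
# Verify the rebuilt jar exists and peek at its manifest (path assumed
# from the version configured in build.properties above).
JAR=${JAR:-/usr/local/hadoop/hadoop-common/build/hadoop-core-0.20-append-for-hbase.jar}
if [ -f "$JAR" ]; then
  ls -lh "$JAR"
  unzip -p "$JAR" META-INF/MANIFEST.MF 2>/dev/null | head -n 5
else
  echo "expected jar not found: $JAR (did ant mvn-install finish?)"
fi
```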

Optionally run the test suites against the build (they take a long time; here they were aborted midway)
  ant test
  ant test-core
Replace Hadoop's jar; the copy must be renamed to match the hadoop-*-core.jar pattern that Hadoop's scripts expect on the classpath
  grid@master1:/usr/local/hadoop$ mv hadoop-0.20.2-core.jar hadoop-0.20.2-core.jar.bak

  grid@master1:/usr/local/hadoop$ cp hadoop-common/build/hadoop-core-0.20-append-for-hbase.jar hadoop-0.20-append-for-hbase-core.jar
Replace HBase's jar; no rename is needed here
  grid@master1:/usr/local/hbase/lib$ mv hadoop-0.20.2-core.jar hadoop-0.20.2-core.jar.bak

  grid@master1:/usr/local/hadoop/hadoop-common/build$ cp hadoop-core-0.20-append-for-hbase.jar /usr/local/hbase/lib/
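On a multi-node cluster every node needs the same replacement, in both the Hadoop directory and HBase's lib directory, or clients and servers will disagree on RPC versions. The sketch below only prints the per-node copy commands for review before running them; the hostnames slave1 and slave2 are placeholders, not from the original post.

```shell
# Print (not run) the per-node copy commands; hostnames are hypothetical.
JAR=/usr/local/hadoop/hadoop-common/build/hadoop-core-0.20-append-for-hbase.jar
for host in slave1 slave2; do
  printf 'scp %s grid@%s:/usr/local/hadoop/hadoop-0.20-append-for-hbase-core.jar\n' "$JAR" "$host"
  printf 'scp %s grid@%s:/usr/local/hbase/lib/\n' "$JAR" "$host"
done
```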
4、Restart all services
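Order matters when restarting: HBase depends on HDFS, so it should go down first and come up last. A minimal sketch, assuming the stock stop/start scripts under the install paths used above (ZooKeeper, installed separately here, can stay up for a jar swap):

```shell
# Run each stop/start script if present: HBase down first, up last.
for cmd in /usr/local/hbase/bin/stop-hbase.sh \
           /usr/local/hadoop/bin/stop-all.sh \
           /usr/local/hadoop/bin/start-all.sh \
           /usr/local/hbase/bin/start-hbase.sh; do
  if [ -x "$cmd" ]; then "$cmd"; else echo "skipping (not found): $cmd"; fi
done
```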


Reference:
http://www.michael-noll.com/blog/2011/04/14/building-an-hadoop-0-20-x-version-for-hbase-0-90-2/

From the ITPUB blog; link: http://blog.itpub.net/12219480/viewspace-1778955/. Please credit the source when reposting.
