Installing Snappy on CDH 4.5.0

Building from source: http://www.cnblogs.com/chengxin1982/p/3862289.html

Compression benchmark reference: http://blog.jeoygin.org/2012/03/java-compression-library-test.html

1. Snappy
References:
 http://sstudent.blog.51cto.com/7252708/1405485 (primary)
 http://wzxwzx2011.blog.51cto.com/2997448/1111619

Snappy library:
wget http://pkgs.fedoraproject.org/repo/pkgs/snappy/snappy-1.1.1.tar.gz/8887e3b7253b22a31f5486bca3cbc1c2/snappy-1.1.1.tar.gz
tar -zxvf snappy-1.1.1.tar.gz
cd snappy-1.1.1
./configure
make
sudo make install

or, on CentOS: sudo yum install snappy snappy-devel


Install the hadoop-snappy package (https://github.com/electrum/hadoop-snappy):
    sudo apt-get install automake libtool
    cd hadoop-snappy
    mvn package


Edit core-site.xml:
 <property>
    <name>io.compression.codecs</name>
        <value>
                org.apache.hadoop.io.compress.GzipCodec,
                org.apache.hadoop.io.compress.DefaultCodec,
                org.apache.hadoop.io.compress.BZip2Codec,
                org.apache.hadoop.io.compress.SnappyCodec
        </value>
</property>


Error 1: Cannot run program "autoreconf"
Fix: apt-get install autoconf automake libtool (see http://www.cnblogs.com/shitouer/archive/2013/01/05/2845954.html)

Error 2
java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
    at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:62)
    at org.apache.hadoop.io.compress.SnappyCodec.createCompressor(SnappyCodec.java:138)
    at org.apache.hadoop.io.compress.SnappyCodec.createOutputStream(SnappyCodec.java:93)
    at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.getRecordWriter(TextOutputFormat.java:136)
    at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.<init>(ReduceTask.java:562)
    at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:636)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:404)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:443)

Fix:
mkdir -p $HADOOP_HOME/lib/native/Linux-amd64-64

cp -r hadoop-snappy-0.0.1-SNAPSHOT/lib/native/Linux-amd64-64/* $HADOOP_HOME/lib/native/Linux-amd64-64
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
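The loader-path step above can be sketched as a shell-profile fragment. The /usr/lib/hadoop default is only an assumed example; set HADOOP_HOME to match your cluster layout:

```shell
# Append the Snappy native library directories to the dynamic loader path.
# /usr/lib/hadoop is an assumed default, not from the original setup.
HADOOP_HOME=${HADOOP_HOME:-/usr/lib/hadoop}
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64:/usr/local/lib"
echo "$LD_LIBRARY_PATH"
```

Putting this in the hadoop-env.sh or the daemon user's profile makes the setting survive restarts instead of applying to one shell session only.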

Error 3: DefaultCodec/GzipCodec not found
Edit core-site.xml and remove all whitespace from the codec list:
  <property>
        <name>io.compression.codecs</name>
        <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.BZip2Codec,org.apache.hadoop.io.compress.SnappyCodec</value>
  </property>
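The whitespace problem can be reproduced with plain JDK classes standing in for the codec classes (java.lang.String and java.lang.Integer here are illustrative, not Hadoop code): the codec list is split on commas and each token is loaded by reflection, so a token that carries a leading newline or spaces is not a valid class name.

```java
// Sketch: why whitespace inside io.compression.codecs breaks codec loading.
// A comma-separated class list is split and each token passed to Class.forName;
// the second token below still carries the newline and indentation.
public class CodecNameCheck {
    public static void main(String[] args) {
        String raw = "java.lang.String,\n                java.lang.Integer";
        for (String name : raw.split(",")) {
            try {
                Class.forName(name);
                System.out.println("loaded: " + name.trim());
            } catch (ClassNotFoundException e) {
                System.out.println("not found: [" + name + "]");
            }
        }
    }
}
```

The first token loads; the second fails even though the trimmed class exists, which is exactly the DefaultCodec/GzipCodec symptom above.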



Configure HBase:
mkdir -p $HBASE_HOME/lib/native/Linux-amd64-64/
cp -r $HADOOP_HOME/lib/native/Linux-amd64-64/* $HBASE_HOME/lib/native/Linux-amd64-64/
cp hadoop-snappy-0.0.1-SNAPSHOT.jar $HBASE_HOME/lib/

Usage (in the job driver):
conf.setBoolean("mapreduce.map.output.compress", true);
conf.setClass("mapreduce.map.output.compression.codec", SnappyCodec.class, CompressionCodec.class);
conf.setBoolean("mapreduce.output.fileoutputformat.compress", true);  // compress the job output
conf.setClass("mapreduce.output.fileoutputformat.compress.codec", SnappyCodec.class, CompressionCodec.class);
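The same settings can be applied cluster-wide in mapred-site.xml instead of per job; a minimal fragment for the map-output side (the values mirror the Java calls above, and cluster-wide defaults are an assumption of this sketch, not part of the original setup):

```xml
<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.map.output.compression.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```

Per-job conf.set* calls override these defaults, so individual jobs can still opt out.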



Reposted from: https://www.cnblogs.com/chengxin1982/p/3862309.html
