Configuring Hadoop 2.7.2 and HBase 1.1.5 to Support the Snappy Compression Library

Part 1: Snappy Support in Hadoop

1. Rebuild the Hadoop 2.7.2 source code so that it supports the Snappy compression library: http://blog.itpub.net/30089851/viewspace-2120631/

2. Check that libsnappy.so.1.2.0 is present under $HADOOP_HOME/lib/native/:

[root@sht-sgmhadoopnn-01 ~]# ll $HADOOP_HOME/lib/native/
total 4880
-rw-r--r-- 1 root root 1211196 Jun 21 19:11 libhadoop.a
-rw-r--r-- 1 root root 1485756 Jun 21 19:12 libhadooppipes.a
lrwxrwxrwx 1 root root      18 Jun 21 19:45 libhadoop.so -> libhadoop.so.1.0.0
-rwxr-xr-x 1 root root  717060 Jun 21 19:11 libhadoop.so.1.0.0
-rw-r--r-- 1 root root  582128 Jun 21 19:12 libhadooputils.a
-rw-r--r-- 1 root root  365052 Jun 21 19:11 libhdfs.a
lrwxrwxrwx 1 root root      16 Jun 21 19:45 libhdfs.so -> libhdfs.so.0.0.0
-rwxr-xr-x 1 root root  229289 Jun 21 19:11 libhdfs.so.0.0.0
-rw-r--r-- 1 root root  233538 Jun 21 19:11 libsnappy.a
-rwxr-xr-x 1 root root     953 Jun 21 19:11 libsnappy.la
lrwxrwxrwx 1 root root      18 Jun 21 19:45 libsnappy.so -> libsnappy.so.1.2.0
lrwxrwxrwx 1 root root      18 Jun 21 19:45 libsnappy.so.1 -> libsnappy.so.1.2.0
-rwxr-xr-x 1 root root  147726 Jun 21 19:11 libsnappy.so.1.2.0
[root@sht-sgmhadoopnn-01 ~]#
### The steps below assume the cluster is already installed.

3. Edit $HADOOP_HOME/etc/hadoop/hadoop-env.sh and add:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib:$HADOOP_HOME/lib/native"

 

### This resolves the warning "Unable to load native-hadoop library":

[root@sht-sgmhadoopnn-01 ~]# hadoop fs -ls /
16/06/21 15:08:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

4. Edit $HADOOP_HOME/etc/hadoop/core-site.xml and register the Snappy codec:


<property>
    <name>io.compression.codecs</name>
    <value>org.apache.hadoop.io.compress.GzipCodec,
        org.apache.hadoop.io.compress.DefaultCodec,
        org.apache.hadoop.io.compress.BZip2Codec,
        org.apache.hadoop.io.compress.SnappyCodec
    </value>
</property>

5. Edit the compression-related properties in $HADOOP_HOME/etc/hadoop/mapred-site.xml to test Snappy:


<property>
    <name>mapreduce.map.output.compress</name>
    <value>true</value>
</property>

<property>
    <name>mapreduce.map.output.compress.codec</name>
    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
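The properties above compress only the intermediate map output. If the final job output should also be Snappy-compressed, the standard Hadoop 2.x output-compression properties below can be added to mapred-site.xml as well (optional, and not needed for the wordcount test later — compressed final output is not directly readable with `hadoop fs -cat`):

```xml
<property>
    <name>mapreduce.output.fileoutputformat.compress</name>
    <value>true</value>
</property>
<property>
    <name>mapreduce.output.fileoutputformat.compress.codec</name>
    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```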

6. Add the following to $HADOOP_HOME/etc/hadoop/yarn-site.xml (enable log aggregation, set the YARN log-server URL, and configure YARN memory and CPU):


<property>
    <name>yarn.log-aggregation-enable</name>
    <value>true</value>
</property>
<property>
    <name>yarn.log.server.url</name>
    <value>http://sht-sgmhadoopnn-01:19888/jobhistory/logs</value>
</property>

<property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>10240</value>
</property>
<property>
    <name>yarn.scheduler.minimum-allocation-mb</name>
    <value>1500</value>
    <description>Minimum memory a single container may request; default 1024 MB</description>
</property>
<property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>2500</value>
    <description>Maximum memory a single container may request; default 8192 MB</description>
</property>
<property>
    <name>yarn.nodemanager.resource.cpu-vcores</name>
    <value>2</value>
</property>
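As a sanity check on the memory values above: with the default Capacity Scheduler, each container request is rounded up to a multiple of yarn.scheduler.minimum-allocation-mb, so a quick sketch shows how many max-size containers one NodeManager can actually hold (values copied from the config above; rounding behavior is an assumption about the default scheduler):

```shell
# Back-of-envelope check of the YARN memory settings above.
NM_MEM=10240      # yarn.nodemanager.resource.memory-mb
MIN_ALLOC=1500    # yarn.scheduler.minimum-allocation-mb
MAX_ALLOC=2500    # yarn.scheduler.maximum-allocation-mb

# The scheduler rounds each request up to a multiple of the minimum
# allocation, so a 2500 MB request actually reserves 3000 MB.
granted=$(( (MAX_ALLOC + MIN_ALLOC - 1) / MIN_ALLOC * MIN_ALLOC ))
per_node=$(( NM_MEM / granted ))
echo "a max-size container reserves ${granted} MB"
echo "memory-bound containers per NodeManager: ${per_node}"
```

With yarn.nodemanager.resource.cpu-vcores set to 2, CPU (not memory) may end up being the tighter limit on concurrent containers.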

7. Sync hadoop-env.sh, core-site.xml, mapred-site.xml, and yarn-site.xml to the other nodes in the cluster.
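A minimal sketch of the sync step (the worker hostnames are hypothetical; the `echo` keeps this a dry run — remove it to actually copy):

```shell
# Push the four modified config files to each worker node (dry run).
sync_conf() {
  # $1 = remote host; the leading echo makes this a dry run
  for f in hadoop-env.sh core-site.xml mapred-site.xml yarn-site.xml; do
    echo scp "$HADOOP_HOME/etc/hadoop/$f" "$1:$HADOOP_HOME/etc/hadoop/"
  done
}

for host in sht-sgmhadoopdn-01 sht-sgmhadoopdn-02; do
  sync_conf "$host"
done
```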

8. Restart the Hadoop cluster.

9. Verification 1: hadoop checknative

[root@sht-sgmhadoopnn-01 ~]# hadoop checknative
16/06/25 12:58:13 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
16/06/25 12:58:13 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop:  true /hadoop/hadoop-2.7.2/lib/native/libhadoop.so.1.0.0
zlib:    true /usr/local/lib/libz.so.1
snappy:  true /hadoop/hadoop-2.7.2/lib/native/libsnappy.so.1
lz4:     true revision:99
bzip2:   false
openssl: true /usr/lib64/libcrypto.so
[root@sht-sgmhadoopnn-01 ~]#

### The native Hadoop library loads, and Snappy is supported.

10. Verification 2: run a wordcount job with Snappy-compressed map output

[root@sht-sgmhadoopnn-01 ~]# vi test.log
a
c d
c d d d a
1 2
a

[root@sht-sgmhadoopnn-01 ~]# hadoop fs -mkdir /input
[root@sht-sgmhadoopnn-01 ~]# hadoop fs -put test.log /input/
[root@sht-sgmhadoopnn-01 ~]# hadoop jar /hadoop/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar wordcount /input /output1
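The expected counts for test.log can be computed locally first (no cluster needed) and then compared against `hadoop fs -cat /output1/part-r-*`. A small sketch of the same wordcount (space-separated here; the real job output is tab-separated):

```shell
# Local wordcount over the same test.log contents, for comparison
# with the MapReduce job's output in /output1.
# Tokens: a=3, c=2, d=4, 1=1, 2=1.
expected=$(printf 'a\nc d\nc d d d a\n1 2\na\n' \
  | tr -s ' ' '\n' | sort | uniq -c \
  | awk '{print $2" "$1}')
echo "$expected"
```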

To verify that the job succeeded, check its output under /output1.


  
