Reposted from: http://guoyunsky.iteye.com/blog/1237327
ERROR lzo.GPLNativeCodeLoader: Could not load native gpl library
java.lang.UnsatisfiedLinkError: no gplcompression in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1738)
    at java.lang.Runtime.loadLibrary0(Runtime.java:823)
    at java.lang.System.loadLibrary(System.java:1028)
    at com.hadoop.compression.lzo.GPLNativeCodeLoader.<clinit>(GPLNativeCodeLoader.java:32)
    at com.hadoop.compression.lzo.LzoCodec.<clinit>(LzoCodec.java:71)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:247)
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:943)
    at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:89)
    at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:134)
    at com.twitter.elephantbird.mapreduce.input.LzoRecordReader.initialize(LzoRecordReader.java:61)
    at com.twitter.elephantbird.mapreduce.input.LzoBinaryB64LineRecordReader.initialize(LzoBinaryB64LineRecordReader.java:79)
    at org.apache.hadoop.mapreduce.lib.input.DelegatingRecordReader.initialize(DelegatingRecordReader.java:80)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:450)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:210)
11/11/07 10:15:02 ERROR lzo.LzoCodec: Cannot load native-lzo without native-hadoop
11/11/07 10:15:02 WARN mapred.LocalJobRunner: job_local_0001
java.lang.RuntimeException: native-lzo library not available
    at com.hadoop.compression.lzo.LzopCodec.createDecompressor(LzopCodec.java:91)
    at com.hadoop.compression.lzo.LzopCodec.createInputStream(LzopCodec.java:76)
    at com.twitter.elephantbird.mapreduce.input.LzoRecordReader.initialize(LzoRecordReader.java:71)
    at com.twitter.elephantbird.mapreduce.input.LzoBinaryB64LineRecordReader.initialize(LzoBinaryB64LineRecordReader.java:79)
    at org.apache.hadoop.mapreduce.lib.input.DelegatingRecordReader.initialize(DelegatingRecordReader.java:80)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:450)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:210)
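The UnsatisfiedLinkError above means the JVM searched every directory on java.library.path for the native library and found nothing. A minimal sketch (plain JDK, no Hadoop required) of what the JVM is actually looking for:

```java
public class LibPathCheck {
    public static void main(String[] args) {
        // Directories the JVM searches when System.loadLibrary("gplcompression") runs
        System.out.println(System.getProperty("java.library.path"));
        // Platform-specific file name it looks for; on Linux this is "libgplcompression.so"
        System.out.println(System.mapLibraryName("gplcompression"));
    }
}
```

If the printed path list does not contain the directory holding libgplcompression.so, loading fails exactly as in the trace above.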
Thanks go to Google here, but the search was not easy: the fixes scattered around the web are incomplete, so I am consolidating them below.
The first result I found was http://p-x1984.iteye.com/blog/1157145, which blames the JDK and says to replace a 32-bit JDK with a 64-bit one. So I checked my JDK with: java -version. (You can check your OS bitness first with uname -a.) It turned out my JDK was already 64-bit:
java version "1.6.0_26"
Java(TM) SE Runtime Environment (build 1.6.0_26-b03)
Java HotSpot(TM) 64-Bit Server VM (build 20.1-b02, mixed mode)
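For reference, the JVM's bitness can also be read from system properties; a small sketch (sun.arch.data.model is a HotSpot-specific property, so treat that part as an assumption on other JVMs):

```java
public class ArchCheck {
    public static void main(String[] args) {
        // "amd64"/"x86_64" indicates a 64-bit JVM, "i386"/"x86" a 32-bit one
        System.out.println(System.getProperty("os.arch"));
        // HotSpot-only: "64" or "32"
        System.out.println(System.getProperty("sun.arch.data.model"));
    }
}
```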
Then I remembered that I had never actually installed LZO at all. I also found http://comments.gmane.org/gmane.comp.java.hadoop.hbase.user/15776, which confirmed that suspicion, and followed the FAQ it points to (http://code.google.com/a/apache-extras.org/p/hadoop-gpl-compression/wiki/FAQ?redir=1) step by step. The commands:
1. Install hadoop-gpl-compression first
1) Download it (I usually switch to my downloads directory first with cd $HOME/Downloads; use whatever directory you like):
wget http://hadoop-gpl-compression.googlecode.com/files/hadoop-gpl-compression-0.1.0-rc0.tar.gz
2) Extract it:
tar -xvf hadoop-gpl-compression-0.1.0-rc0.tar.gz
3) Move the extracted hadoop-gpl-compression-0.1.0.jar into your hadoop/lib directory ($HADOOP_HOME/lib here; set $HADOOP_HOME if it is not set, or substitute your own Hadoop directory):
mv hadoop-gpl-compression-0.1.0/hadoop-gpl-compression-0.1.0.jar $HADOOP_HOME/lib/
4) Copy the native libraries into Hadoop's native-library directories:
mv hadoop-gpl-compression-0.1.0/lib/native/Linux-i386-32/* $HADOOP_HOME/lib/native/Linux-i386-32/
mv hadoop-gpl-compression-0.1.0/lib/native/Linux-amd64-64/* $HADOOP_HOME/lib/native/Linux-amd64-64/
2. Install the LZO library
1) Install g++ and gcc:
sudo apt-get install g++
sudo apt-get install gcc
2) Download LZO:
wget http://www.oberhumer.com/opensource/lzo/download/lzo-2.03.tar.gz
3) Extract it:
tar -xvf lzo-2.03.tar.gz
4) Build and install (only the install step needs root):
cd lzo-2.03
./configure
make
sudo make install
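After make install, the shared library lands under the configure default prefix, /usr/local/lib (adjust the path if you passed --prefix). A quick sanity check for it from Java:

```java
import java.io.File;

public class LzoLibCheck {
    public static void main(String[] args) {
        // System.mapLibraryName turns "lzo2" into the platform file name, e.g. "liblzo2.so" on Linux
        File lib = new File("/usr/local/lib", System.mapLibraryName("lzo2"));
        System.out.println(lib.getPath() + (lib.exists() ? " found" : " missing"));
    }
}
```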
If you still hit the error above while debugging the relevant source in Eclipse, Eclipse is probably not picking up /usr/local/lib. There are two fixes:
One is to put /usr/local/lib into your environment. For example, run vi ~/.bashrc and add the line:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib
Then start Eclipse from a terminal by running eclipse (assuming it is on your PATH), or reboot so the variable takes effect;
The other is to set a Native Library Location on your project: right-click the project -> Properties -> Java Build Path -> Libraries, find hadoop-lzo-xxx.jar on the right, expand it, double-click its Native Library Location entry, and set it to /usr/local/lib. This way nothing needs restarting, but in the long run the first approach (the environment variable) is better, so you do not have to repeat the setting for every project.
Finally, a small program to verify that a codec works: it compresses a local file to standard output using the codec class named on the command line.

import java.io.File;
import java.io.FileInputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionOutputStream;
import org.apache.hadoop.util.ReflectionUtils;

public class StreamCompressor {
    public static void main(String[] args) throws Exception {
        // Fully qualified codec class name, e.g. com.hadoop.compression.lzo.LzopCodec
        String codecClassName = args[0];
        FileInputStream in = new FileInputStream(new File("/home/conkeyn/jar/text1.txt"));
        Class<?> codecClass = Class.forName(codecClassName);
        Configuration conf = new Configuration();
        // Instantiate the codec reflectively and wrap stdout in its compressing stream
        CompressionCodec codec = (CompressionCodec) ReflectionUtils.newInstance(codecClass, conf);
        CompressionOutputStream out = codec.createOutputStream(System.out);
        try {
            IOUtils.copyBytes(in, out, 4096, false);
            out.finish(); // flush compressed data without closing System.out
        } finally {
            in.close();
        }
    }
}
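If Hadoop is not on the classpath, the same wrap-the-output-stream idea can be sanity-checked with the JDK's built-in gzip streams. A minimal sketch (no Hadoop required; the sample text is arbitrary):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {
    public static void main(String[] args) throws Exception {
        byte[] original = "hello lzo/gzip compression".getBytes("UTF-8");

        // Compress: wrap the destination stream, just as codec.createOutputStream() wraps System.out
        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        GZIPOutputStream gzOut = new GZIPOutputStream(compressed);
        gzOut.write(original);
        gzOut.finish(); // like CompressionOutputStream.finish(): flush without closing the sink

        // Decompress and verify the round trip
        GZIPInputStream gzIn = new GZIPInputStream(new ByteArrayInputStream(compressed.toByteArray()));
        ByteArrayOutputStream restored = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = gzIn.read(buf)) != -1) restored.write(buf, 0, n);

        // Prints the original text if the round trip succeeded
        System.out.println(new String(restored.toByteArray(), "UTF-8"));
    }
}
```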