Today I hit an error while writing a file to Hadoop from my local machine:

SequenceFile doesn't work with GzipCodec without native-hadoop code!

SequenceFile contains a check that throws this exception:
private static Writer createWriter(Configuration conf, FSDataOutputStream out,
    Class keyClass, Class valClass, boolean compress, boolean blockCompress,
    CompressionCodec codec, Metadata metadata)
    throws IOException {
  if (codec != null && (codec instanceof GzipCodec) &&
      !NativeCodeLoader.isNativeCodeLoaded() &&
      !ZlibFactory.isNativeZlibLoaded(conf)) {
    throw new IllegalArgumentException("SequenceFile doesn't work with " +
        "GzipCodec without native-hadoop code!");
  }
  // ... (rest of the method omitted)
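The guard above can be sketched in isolation. This is a standalone illustration, not Hadoop's code: the two native probes (NativeCodeLoader.isNativeCodeLoaded() and ZlibFactory.isNativeZlibLoaded(conf)) are stubbed as plain booleans, and the codec is represented by a name string instead of a CompressionCodec instance.

```java
// Standalone sketch of the SequenceFile guard; the native-code probes are
// stubbed as booleans for illustration (assumption, not Hadoop's real API).
public class GzipGuardSketch {
    // stand-in for NativeCodeLoader.isNativeCodeLoaded()
    static boolean nativeCodeLoaded = false;
    // stand-in for ZlibFactory.isNativeZlibLoaded(conf)
    static boolean nativeZlibLoaded = false;

    static void checkCodec(String codecName) {
        // only GzipCodec is rejected when no native zlib is available
        if ("GzipCodec".equals(codecName) && !nativeCodeLoaded && !nativeZlibLoaded) {
            throw new IllegalArgumentException("SequenceFile doesn't work with " +
                "GzipCodec without native-hadoop code!");
        }
    }

    public static void main(String[] args) {
        try {
            checkCodec("GzipCodec");
        } catch (IllegalArgumentException e) {
            System.out.println("caught: " + e.getMessage());
        }
        // other codecs pass the guard even without native code
        checkCodec("DefaultCodec");
    }
}
```

Note that the condition only fires for GzipCodec: without the native library loaded, both probes return false and the writer refuses to be created.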
You can test for this error with the following code:
import org.apache.hadoop.io.compress.zlib.ZlibCompressor;

public class MyTester {
    public static void main(String[] args) {
        new ZlibCompressor();
    }
}
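MyTester fails because ZlibCompressor is backed by the native zlib bindings in libhadoop. The JDK itself ships a pure-Java zlib via java.util.zip, which (as I understand it) is the fallback path Hadoop's DefaultCodec can take when no native library is present. A minimal round-trip through that built-in zlib, with no Hadoop dependency:

```java
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

// Demonstrates the JDK's built-in zlib (java.util.zip), which works without
// any native-hadoop library. Class and method names here are my own.
public class BuiltInZlibDemo {
    public static byte[] roundTrip(byte[] input) throws DataFormatException {
        // compress with the JDK deflater
        Deflater deflater = new Deflater();
        deflater.setInput(input);
        deflater.finish();
        byte[] compressed = new byte[input.length * 2 + 64];
        int clen = deflater.deflate(compressed);
        deflater.end();

        // decompress with the JDK inflater
        Inflater inflater = new Inflater();
        inflater.setInput(compressed, 0, clen);
        byte[] restored = new byte[input.length];
        int rlen = inflater.inflate(restored);
        inflater.end();
        return java.util.Arrays.copyOf(restored, rlen);
    }

    public static void main(String[] args) throws Exception {
        byte[] data = "hello hadoop".getBytes("UTF-8");
        System.out.println(new String(roundTrip(data), "UTF-8"));
    }
}
```

This is why the exception singles out GzipCodec: it is the codec that insists on native zlib.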
How can this be avoided?

There are two solutions:
1. Set hadoop.native.lib to false.
2. Copy the .so files under /duitang/dist/sys/hadoop-1.2.1/lib/native/Linux-amd64-64/* to /duitang/dist/sys/java/jre/lib/amd64.
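The two fixes above might look roughly like this on the command line. This is a sketch: the paths are taken from this machine's layout, and the property can equally be set in core-site.xml rather than per invocation.

```shell
# Option 1: turn off the native-lib preference for a client/job
# (the property can also go in core-site.xml).
#   <property>
#     <name>hadoop.native.lib</name>
#     <value>false</value>
#   </property>

# Option 2: make the bundled native libraries visible to the JRE
cp /duitang/dist/sys/hadoop-1.2.1/lib/native/Linux-amd64-64/*.so* \
   /duitang/dist/sys/java/jre/lib/amd64/
```

Option 2 works because the JRE's lib/amd64 directory is on the default native library search path, so libhadoop and its zlib bindings load without extra configuration.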