Hadoop Native Libraries


https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html


WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable


References:

http://blog.csdn.net/xichenguan/article/details/38797331

http://www.aboutyun.com/thread-7175-1-1.html

http://blog.csdn.net/lalaguozhe/article/details/10580727

http://stackoverflow.com/questions/19943766/hadoop-unable-to-load-native-hadoop-library-for-your-platform-warning


The answer depends... I had just installed Hadoop 2.6 from a tarball on 64-bit CentOS 6.6. The Hadoop install did indeed come with a prebuilt 64-bit native library. For my install, it is here:


/opt/hadoop/lib/native/libhadoop.so.1.0.0
And I know it is 64-bit:


[hadoop@VMWHADTEST01 native]$ ldd libhadoop.so.1.0.0
./libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)
linux-vdso.so.1 =>  (0x00007fff43510000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007f9be553a000)
libc.so.6 => /lib64/libc.so.6 (0x00007f9be51a5000)
/lib64/ld-linux-x86-64.so.2 (0x00007f9be5966000)
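(An easier way to check whether the library is 32-bit or 64-bit, had I thought of it, is the file command, which reports the ELF class directly. A minimal sketch against my install's path:)


# look for "ELF 64-bit" vs "ELF 32-bit" in the output
file /opt/hadoop/lib/native/libhadoop.so.1.0.0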
Unfortunately, I stupidly overlooked the answer that was right there staring me in the face, as I was focused on "Is this library 32 or 64 bit?":


`GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)
So, lesson learned. Anyway, the rest at least led me to a way to suppress the warning. I continued and did everything recommended in the other answers to provide the library path via the HADOOP_OPTS environment variable, to no avail. So I looked at the source code. The module that generates the error gives you the hint (util.NativeCodeLoader):


15/06/18 18:59:23 WARN util.NativeCodeLoader: Unable to load native-hadoop    library for your platform... using builtin-java classes where applicable
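(For reference, the HADOOP_OPTS attempt mentioned above typically looks something like the following in hadoop-env.sh. This is only a sketch of the commonly recommended fix; the /opt/hadoop path is my install's, and in my case it did not make the warning go away:)


# point the JVM at the directory holding libhadoop.so (did not help here)
export HADOOP_COMMON_LIB_NATIVE_DIR="/opt/hadoop/lib/native"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/opt/hadoop/lib/native"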
So, off to the NativeCodeLoader source to see what it does:


http://grepcode.com/file/repo1.maven.org/maven2/com.ning/metrics.action/0.2.6/org/apache/hadoop/util/NativeCodeLoader.java/


Ah, there is some debug level logging - let's turn that on and see if we get some additional help. This is done by adding the following line to the $HADOOP_CONF_DIR/log4j.properties file:


log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG
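(If you only want the extra logging for a single command rather than permanently, the Hadoop scripts also honor the HADOOP_ROOT_LOGGER environment variable; note that this turns on DEBUG for everything, not just NativeCodeLoader. A quick sketch:)


# one-off debug run; HADOOP_ROOT_LOGGER applies to the whole command
HADOOP_ROOT_LOGGER=DEBUG,console hdfs dfs -ls /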
Then I ran a command that generates the original warning, like stop-dfs.sh, and got this goodie:


15/06/18 19:05:19 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /opt/hadoop/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)
And the answer is revealed in this snippet of the debug message (the same thing that the previous ldd command 'tried' to tell me):


`GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)
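(If you want to see exactly which GLIBC versions a shared object asks for, objdump can list the versioned symbols it imports. A minimal sketch against my library path:)


# list the GLIBC version tags the library requires; GLIBC_2.14 shows up here
objdump -T /opt/hadoop/lib/native/libhadoop.so.1.0.0 | grep GLIBC_2.14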
What version of GLIBC do I have? Here's a simple trick to find out:


[hadoop@VMWHADTEST01 hadoop]$ ldd --version
ldd (GNU libc) 2.12
So, I can't update my system's GLIBC to 2.14. The only options are to build the native libraries from source on my OS, or to suppress the warning and just ignore it for now. I opted to just suppress the annoying warning for now (but do plan to build from source in the future) by using the same logging option we used to get the debug message, except now at ERROR level.


log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
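(When I do get around to building from source, the standard way to build the native libraries for the target OS is Hadoop's Maven native profile, as described in BUILDING.txt in the source tarball. A sketch, assuming the documented prerequisites (a JDK, Maven, cmake, zlib/openssl headers and protobuf 2.5.0) are already installed:)


# build a full distribution including native libraries (adjust the version to your source tree)
cd hadoop-2.6.0-src
mvn package -Pdist,native -DskipTests -Dtar
# the resulting libraries land under hadoop-dist/target/hadoop-2.6.0/lib/native;
# copy them over the prebuilt ones in /opt/hadoop/lib/native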
I hope this helps others see that a big benefit of open source software is that you can figure this stuff out if you take some simple logical steps.

