Resolving the Hadoop native-library mismatch

15/06/25 00:14:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Solution:

Enable debug logging to see the real cause:
[hadoop@master001 native]$ export HADOOP_ROOT_LOGGER=DEBUG,console
[hadoop@master001 native]$ hadoop fs -text /test/data/origz/access.log.gz
15/06/25 00:44:05 DEBUG util.Shell: setsid exited with exit code 0
15/06/25 00:44:05 DEBUG conf.Configuration: parsing URL jar:file:/usr/hadoop/share/hadoop/common/hadoop-common-2.5.2.jar!/core-default.xml
15/06/25 00:44:05 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@71be98f5
15/06/25 00:44:05 DEBUG conf.Configuration: parsing URL file:/usr/hadoop/etc/hadoop/core-site.xml
15/06/25 00:44:05 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@97e1986
15/06/25 00:44:06 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
15/06/25 00:44:06 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
15/06/25 00:44:06 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])
15/06/25 00:44:06 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
15/06/25 00:44:06 DEBUG security.Groups:  Creating new Groups object
15/06/25 00:44:06 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
15/06/25 00:44:06 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/hadoop/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /usr/hadoop/lib/native/libhadoop.so.1.0.0)
15/06/25 00:44:06 DEBUG util.NativeCodeLoader: java.library.path=/usr/hadoop/lib/native
15/06/25 00:44:06 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/06/25 00:44:06 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
15/06/25 00:44:06 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
15/06/25 00:44:06 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
15/06/25 00:44:06 DEBUG security.UserGroupInformation: hadoop login
15/06/25 00:44:06 DEBUG security.UserGroupInformation: hadoop login commit
15/06/25 00:44:06 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop
15/06/25 00:44:06 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
15/06/25 00:44:06 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
15/06/25 00:44:06 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
15/06/25 00:44:06 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
15/06/25 00:44:06 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
15/06/25 00:44:06 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
15/06/25 00:44:06 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@78dd667e
15/06/25 00:44:07 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@60dcc9fe
15/06/25 00:44:07 DEBUG shortcircuit.DomainSocketFactory: Both short-circuit local reads and UNIX domain socket are disabled.
15/06/25 00:44:07 DEBUG ipc.Client: The ping interval is 60000 ms.
15/06/25 00:44:07 DEBUG ipc.Client: Connecting to master001/192.168.75.155:8020
15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop: starting, having connections 1
15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop sending #0
15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop got value #0
15/06/25 00:44:07 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 71ms
text: `/test/data/origz/access.log.gz': No such file or directory
15/06/25 00:44:07 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@60dcc9fe
15/06/25 00:44:07 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@60dcc9fe
15/06/25 00:44:07 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@60dcc9fe
15/06/25 00:44:07 DEBUG ipc.Client: Stopping client
15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop: closed
15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop: stopped, remaining connections 0
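The `UnsatisfiedLinkError` above pinpoints the problem: `libhadoop.so.1.0.0` was built against glibc 2.14, while the system provides something older. As a sketch (the library path is taken from the log; adjust it for your layout), the GLIBC symbol versions a shared object requires can be listed with `objdump`:

```shell
# List every GLIBC symbol version referenced by the Hadoop native library.
# Any version newer than the installed glibc triggers the
# UnsatisfiedLinkError seen in the log above.
objdump -T /usr/hadoop/lib/native/libhadoop.so.1.0.0 \
  | grep -o 'GLIBC_[0-9.]*' \
  | sort -Vu
```

The highest version printed (here GLIBC_2.14) is the minimum glibc the library needs.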
Check the system's libc version:
[hadoop@master001 native]$ ll /lib64/libc.so.6
lrwxrwxrwx. 1 root root 12 Apr 14 16:14 /lib64/libc.so.6 -> libc-2.12.so
The symlink shows version 2.12.
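Two other quick checks report the version directly, independent of the symlink name (a sketch; both tools ship with glibc itself):

```shell
# Print the glibc version without reading the symlink target.
getconf GNU_LIBC_VERSION        # e.g. "glibc 2.12"
# ldd is installed by glibc, so its banner reports the same version.
ldd --version | head -n 1
```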
From http://ftp.gnu.org/gnu/glibc/ download:
glibc-2.14.tar.bz2
glibc-linuxthreads-2.5.tar.bz2
[hadoop@master001 native]$ tar -jxvf /home/hadoop/software/glibc-2.14.tar.bz2
[hadoop@master001 native]$ cd glibc-2.14/
[hadoop@master001 glibc-2.14]$ tar -jxvf /home/hadoop/software/glibc-linuxthreads-2.5.tar.bz2
[hadoop@master001 glibc-2.14]$ cd .. # must return to the parent directory; glibc cannot be configured inside its own source tree
[hadoop@master001 native]$ export CFLAGS="-g -O2"           # keep the optimization flag, otherwise the build fails
[hadoop@master001 native]$ ./glibc-2.14/configure --prefix=/usr --disable-profile --enable-add-ons --with-headers=/usr/include --with-binutils=/usr/bin
[hadoop@master001 native]$ make        # compile; this takes a long time and can fail; if it does, rerun make
[hadoop@master001 native]$ sudo make install   # install; must be run as root
# Verify that the version was upgraded
[hadoop@master001 native]$ ll /lib64/libc.so.6
lrwxrwxrwx 1 root root 12 Jun 25 02:07 /lib64/libc.so.6 -> libc-2.14.so # now shows 2.14
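To double-check that the upgraded libc actually exports the symbol version that caused the original failure, the version tags embedded in the library can be grepped (the path matches this CentOS-style system; adjust if libc lives elsewhere):

```shell
# The error complained about GLIBC_2.14; the new libc must now carry
# that tag among its embedded version strings.
strings /lib64/libc.so.6 | grep -x 'GLIBC_2.14'
```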
Enable debug logging again:
[hadoop@master001 native]$ export HADOOP_ROOT_LOGGER=DEBUG,console
# The log now contains "Loaded the native-hadoop library", confirming the native library loads without errors
[hadoop@master001 native]$ hadoop fs -text /test/data/origz/access.log.gz
15/06/25 02:10:01 DEBUG util.Shell: setsid exited with exit code 0
15/06/25 02:10:01 DEBUG conf.Configuration: parsing URL jar:file:/usr/hadoop/share/hadoop/common/hadoop-common-2.5.2.jar!/core-default.xml
15/06/25 02:10:01 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@71be98f5
15/06/25 02:10:01 DEBUG conf.Configuration: parsing URL file:/usr/hadoop/etc/hadoop/core-site.xml
15/06/25 02:10:01 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@97e1986
15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])
15/06/25 02:10:02 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
15/06/25 02:10:02 DEBUG security.Groups:  Creating new Groups object
15/06/25 02:10:02 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
15/06/25 02:10:02 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
15/06/25 02:10:02 DEBUG security.JniBasedUnixGroupsMapping: Using JniBasedUnixGroupsMapping for Group resolution
15/06/25 02:10:02 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
15/06/25 02:10:02 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
15/06/25 02:10:02 DEBUG security.UserGroupInformation: hadoop login
15/06/25 02:10:02 DEBUG security.UserGroupInformation: hadoop login commit
15/06/25 02:10:02 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop
15/06/25 02:10:02 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
15/06/25 02:10:03 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
15/06/25 02:10:03 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@501edcf1
15/06/25 02:10:03 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$1@7e499e08: starting with interruptCheckPeriodMs = 60000
15/06/25 02:10:04 DEBUG shortcircuit.DomainSocketFactory: Both short-circuit local reads and UNIX domain socket are disabled.
15/06/25 02:10:04 DEBUG ipc.Client: The ping interval is 60000 ms.
15/06/25 02:10:04 DEBUG ipc.Client: Connecting to master001/192.168.75.155:8020
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop sending #0
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: starting, having connections 1
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop got value #0
15/06/25 02:10:04 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 122ms
text: `/test/data/origz/access.log.gz': No such file or directory
15/06/25 02:10:04 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG ipc.Client: Stopping client
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: closed
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: stopped, remaining connections 0
==============================
Once this is done, restart the cluster:
[hadoop@master001 ~]$ sh /usr/hadoop/sbin/start-dfs.sh
[hadoop@master001 ~]$ sh /usr/hadoop/sbin/start-yarn.sh
[hadoop@master001 ~]$ hadoop fs -ls /
[hadoop@master001 ~]$ hadoop fs -mkdir /usr
[hadoop@master001 ~]$ hadoop fs -ls /
Found 1 items
drwxr-xr-x   - hadoop supergroup          0 2015-06-25 02:27 /usr
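A final check that avoids parsing DEBUG logs: Hadoop 2.x ships a `checknative` subcommand that prints one status line per native component. The path in the comment is illustrative for this installation:

```shell
# Reports true/false for each native component (hadoop, zlib, snappy, ...).
hadoop checknative -a 2>&1 | grep '^hadoop:'
# A healthy result looks like:
#   hadoop:  true /usr/hadoop/lib/native/libhadoop.so.1.0.0
```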