[oracle@master opt]$ cd /opt/hadoopdata/hdfs
[oracle@master hdfs]$ ls
data name snn
[oracle@master hdfs]$ start-dfs.sh
18/07/15 06:06:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [master]
master: Error: JAVA_HOME is not set and could not be found.
slave2: Error: JAVA_HOME is not set and could not be found.
slave1: Error: JAVA_HOME is not set and could not be found.
master: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [slave1]
slave1: Error: JAVA_HOME is not set and could not be found.
18/07/15 06:06:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[oracle@master hdfs]$
[oracle@master native]$ ldd libhadoop.so.1.0.0
./libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)
linux-vdso.so.1 => (0x00007fff386ea000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007fa46cdc3000)
libpthread.so.0 => /lib64/libpthread.so.0 (0x00007fa46cba5000)
libc.so.6 => /lib64/libc.so.6 (0x00007fa46c811000)
/lib64/ld-linux-x86-64.so.2 (0x000055af585c3000)
[oracle@master native]$
[oracle@master native]$ ldd --version
ldd (GNU libc) 2.12
Copyright (C) 2010 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Written by Roland McGrath and Ulrich Drepper.
[oracle@master native]$
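Before patching anything it helps to confirm the mismatch from the OS side as well. `ldd --version` above reports the installed glibc; as a cross-check (not part of the original session), `getconf` returns the same answer without parsing banner text:

```shell
# Print the C library version the OS actually provides.
getconf GNU_LIBC_VERSION
# On this cluster it reports glibc 2.12, below the GLIBC_2.14
# that libhadoop.so.1.0.0 was linked against.
```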
So the glibc preinstalled on the system is version 2.12, while the hadoop native library was built against 2.14; that is why the warning is printed and Hadoop falls back to the builtin-java classes.
There are two ways to deal with it. The first is to compile glibc 2.14 from source and install it just for hadoop's use, which is somewhat risky.
The second is simply to suppress the warning in the log4j configuration: add log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR to /opt/hadoop/etc/hadoop/log4j.properties.
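For the second option, this is the line as it would sit in /opt/hadoop/etc/hadoop/log4j.properties. Note it only hides the message; the native library is still not loaded and Hadoop keeps using the pure-Java implementations:

```properties
# Raise the NativeCodeLoader logger's threshold from WARN to ERROR so the
# "Unable to load native-hadoop library" warning is no longer printed.
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
```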
As for the JAVA_HOME errors: set the JAVA_HOME path explicitly in hadoop-env.sh.
[oracle@master hadoop]$ pwd
/opt/hadoop/etc/hadoop
[oracle@master hadoop]$ vi hadoop-env.sh
# The java implementation to use.
#export JAVA_HOME=${JAVA_HOME}
export JAVA_HOME=/opt/jdk
# The jsvc implementation to use. Jsvc is required to run secure datanodes
# that bind to privileged ports to provide authentication of data transfer
"hadoop-env.sh" 118L, 4996C written
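Setting JAVA_HOME in hadoop-env.sh rather than only in a login profile matters because start-dfs.sh launches the daemons through non-interactive ssh sessions, which typically do not source the login profile. A clean environment illustrates why the variable goes missing there:

```shell
# env -i starts a shell with an empty environment, similar to what a
# non-interactive ssh command sees before any profile is sourced:
env -i sh -c 'echo "JAVA_HOME=${JAVA_HOME:-<unset>}"'
# prints: JAVA_HOME=<unset>
```

Since the same error appeared on slave1 and slave2, the edited hadoop-env.sh must also be copied to those nodes (for example with scp) before rerunning start-dfs.sh.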