Configuring a Hadoop Cluster on RHEL 5: Resolving the java.net.NoRouteToHostException: No route to host Problem

Recently I had to migrate a Hadoop cluster that was originally set up on Ubuntu over to RHEL 5, and on startup I ran into a puzzling problem:

The Namenode process came up fine, but after logging into the Datanodes to start the cluster slaves, the Datanodes could not connect to the Namenode. From the Datanode log it is easy to pinpoint the problem: the Datanodes failed to register with the Namenode when the cluster started up. The exception on the Datanode side looks like this:

[shirdrn@slave-01 ~]$ tail -500f hadoop/storage/logs/hadoop-shirdrn-datanode-slave-01.log 
2012-02-20 23:54:02,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = slave-01/192.168.0.181
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 0.22.0
STARTUP_MSG:   classpath = /home/shirdrn/hadoop/hadoop-0.22.0/bin/../conf:/home/shirdrn/installation/jdk1.6.0_30/lib/tools.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/..:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-common-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-common-test-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-0.22.0-sources.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-ant-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-test-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-test-0.22.0-sources.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-mapred-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-mapred-0.22.0-sources.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-mapred-examples-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-mapred-test-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-mapred-tools-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/ant-1.6.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/ant-1.7.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/ant-launcher-1.7.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/asm-3.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/aspectjrt-1.6.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/aspectjtools-1.6.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/avro-1.5.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/avro-compiler-1.5.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/avro-ipc-1.5.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-cli-1.2.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-codec-1.4.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-collections-3.2.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-daemon-1.0.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-el-1.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-httpclient-3.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-lang-2.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-logging-1.1.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-logging-api-1.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-net-1.4.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/core-3.1.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/ecj-3.5.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/guava-r09.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/hsqldb-1.8.0.10.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jackson-core-asl-1.7.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jackson-mapper-asl-1.7.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jasper-compiler-5.5.12.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jasper-runtime-5.5.12.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jdiff-1.0.9.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jets3t-0.7.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jetty-6.1.26.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jetty-util-6.1.26.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jsch-0.1.42.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jsp-2.1-glassfish-2.1.v20091210.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jsp-2.1-jetty-6.1.26.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jsp-api-2.1-glassfish-2.1.v20091210.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/junit-4.8.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/kfs-0.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0
/bin/../lib/log4j-1.2.16.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/mockito-all-1.8.2.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/mockito-all-1.8.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/oro-2.0.8.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/paranamer-2.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/paranamer-ant-2.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/paranamer-generator-2.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/qdox-1.12.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/servlet-api-2.5-20081211.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/slf4j-api-1.6.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/slf4j-log4j12-1.6.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/snappy-java-1.0.3.2.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/velocity-1.6.4.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/xmlenc-0.52.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jsp-2.1/*.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/..:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../conf:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-0.22.0-sources.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-ant-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-test-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-test-0.22.0-sources.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/ant-1.6.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/ant-1.7.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/ant-launcher-1.7.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/asm-3.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/aspectjrt-1.6.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/aspectjtools-1.6.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/avro-1.5.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/avro-compiler-1.5.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/avro-ipc-1.5.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-cli-1.2.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-codec-1.4.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-collections-3.2.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-daemon-1.0.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-el-1.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-httpclient-3.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-lang-2.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-logging-1.1.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-logging-api-1.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-net-1.4.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/core-3.1.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/ecj-3.5.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/guava-r09.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/hsqldb-1.8.0.10.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jackson-core-asl-1.7.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jackson-mapper-asl-1.7.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jasper-compiler-5.5.12.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jasper-runtime-5.5.12.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jdiff-1.0.9.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jets3t-0.7.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jetty-6.1.26.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jetty-util-6.1.26.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jsch-0.1.42.jar:/home/shirdrn/ha
doop/hadoop-0.22.0/bin/../lib/jsp-2.1-glassfish-2.1.v20091210.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jsp-2.1-jetty-6.1.26.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jsp-api-2.1-glassfish-2.1.v20091210.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/junit-4.8.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/kfs-0.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/log4j-1.2.16.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/mockito-all-1.8.2.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/mockito-all-1.8.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/oro-2.0.8.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/paranamer-2.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/paranamer-ant-2.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/paranamer-generator-2.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/qdox-1.12.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/servlet-api-2.5-20081211.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/slf4j-api-1.6.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/slf4j-log4j12-1.6.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/snappy-java-1.0.3.2.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/velocity-1.6.4.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/xmlenc-0.52.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/..:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-0.22.0-sources.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-ant-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-test-0.22.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../hadoop-hdfs-test-0.22.0-sources.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/ant-1.6.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/ant-1.7.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/ant-launcher-1.7.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/asm-3.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/aspectjrt-1.6.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/aspectjtools-1.6.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/avro-1.5.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/avro-compiler-1.5.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/avro-ipc-1.5.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-cli-1.2.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-codec-1.4.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-collections-3.2.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-daemon-1.0.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-el-1.0.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-httpclient-3.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-lang-2.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-logging-1.1.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-logging-api-1.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/commons-net-1.4.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/core-3.1.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/ecj-3.5.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/guava-r09.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/hsqldb-1.8.0.10.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jackson-core-asl-1.7.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jackson-mapper-asl-1.7.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jasper-compiler-5.5.12.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jasper-runtime-5.5.12.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jdiff-1.0.9.jar:/h
ome/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jets3t-0.7.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jetty-6.1.26.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jetty-util-6.1.26.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jsch-0.1.42.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jsp-2.1-glassfish-2.1.v20091210.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jsp-2.1-jetty-6.1.26.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/jsp-api-2.1-glassfish-2.1.v20091210.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/junit-4.8.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/kfs-0.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/log4j-1.2.16.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/mockito-all-1.8.2.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/mockito-all-1.8.5.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/oro-2.0.8.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/paranamer-2.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/paranamer-ant-2.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/paranamer-generator-2.3.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/qdox-1.12.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/servlet-api-2.5-20081211.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/slf4j-api-1.6.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/slf4j-log4j12-1.6.1.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/snappy-java-1.0.3.2.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/velocity-1.6.4.jar:/home/shirdrn/hadoop/hadoop-0.22.0/bin/../lib/xmlenc-0.52.jar
STARTUP_MSG:   build = http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.22/common -r 1207774; compiled by 'jenkins' on Sun Dec  4 00:57:22 UTC 2011
************************************************************/
2012-02-20 23:54:02,416 WARN org.apache.hadoop.hdfs.server.common.Util: Path /home/shirdrn/hadoop/storage/data/a should be specified as a URI in configuration files. Please update hdfs configuration.
2012-02-20 23:54:02,417 WARN org.apache.hadoop.hdfs.server.common.Util: Path /home/shirdrn/hadoop/storage/data/b should be specified as a URI in configuration files. Please update hdfs configuration.
2012-02-20 23:54:02,417 WARN org.apache.hadoop.hdfs.server.common.Util: Path /home/shirdrn/hadoop/storage/data/c should be specified as a URI in configuration files. Please update hdfs configuration.
2012-02-20 23:54:03,128 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2012-02-20 23:54:04,373 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: master/192.168.0.180:9000. Already tried 0 time(s).
2012-02-20 23:54:05,376 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: master/192.168.0.180:9000. Already tried 1 time(s).
2012-02-20 23:54:06,379 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: master/192.168.0.180:9000. Already tried 2 time(s).
2012-02-20 23:54:07,382 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: master/192.168.0.180:9000. Already tried 3 time(s).
2012-02-20 23:54:08,384 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: master/192.168.0.180:9000. Already tried 4 time(s).
2012-02-20 23:54:09,385 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: master/192.168.0.180:9000. Already tried 5 time(s).
2012-02-20 23:54:10,387 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: master/192.168.0.180:9000. Already tried 6 time(s).
2012-02-20 23:54:11,389 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: master/192.168.0.180:9000. Already tried 7 time(s).
2012-02-20 23:54:12,426 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: master/192.168.0.180:9000. Already tried 8 time(s).
2012-02-20 23:54:13,432 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: master/192.168.0.180:9000. Already tried 9 time(s).
2012-02-20 23:54:13,445 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: Call to master/192.168.0.180:9000 failed on local exception: java.net.NoRouteToHostException: No route to host
        at org.apache.hadoop.ipc.Client.wrapException(Client.java:1063)
        at org.apache.hadoop.ipc.Client.call(Client.java:1031)
        at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:198)
        at $Proxy4.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:235)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:275)
        at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:206)
        at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:185)
        at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:169)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:262)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1567)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1510)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1533)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1680)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1690)
Caused by: java.net.NoRouteToHostException: No route to host
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:373)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:416)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:504)
        at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:206)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1164)
        at org.apache.hadoop.ipc.Client.call(Client.java:1008)
        ... 13 more

2012-02-20 23:54:13,447 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at slave-01/192.168.0.181
************************************************************/
First, confirm whether the Namenode is actually listening on port 9000:

[shirdrn@master ~]$ netstat -nap | grep 9000
(Not all processes could be identified, non-owned process info
 will not be shown, you would have to be root to see it all.)
tcp        0      0 192.168.0.180:9000          0.0.0.0:*                   LISTEN      5374/java 
So the Namenode itself started without any problem.
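
To narrow things down before touching any configuration, it also helps to probe the Namenode's RPC port directly from a Datanode. The commands below are only a minimal check (assuming telnet is installed on the slave; nc -z master 9000 works just as well): if ping succeeds but connecting to port 9000 fails immediately with "No route to host", the packets are being rejected on the way (typically by a firewall on the master) rather than lost to an actual routing problem.

[shirdrn@slave-01 ~]$ ping -c 3 master
[shirdrn@slave-01 ~]$ telnet master 9000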

A bit of searching online turned up a claim (http://hi.baidu.com/tdfrank/blog/item/a1b9e1d95e3b013f10df9b01.html) that a mismatch between the hostname and the host names configured in /etc/hosts can cause the No route to host exception, so I verified this on each node:

On the master node:

[shirdrn@master ~]$ cat /etc/hosts
# Do not remove the following line, or various programs
# that require network functionality will fail.
127.0.0.1               localhost localhost
#::1            localhost6.localdomain6 localhost6

192.168.0.180   master master
192.168.0.181   slave-01        slave-01
192.168.0.182   slave-02        slave-02
192.168.0.183   slave-03        slave-03
[shirdrn@master ~]$ cat /etc/sysconfig/network
NETWORKING=yes
NETWORKING_IPV6=no
HOSTNAME=master
On the slave-01 node:

[shirdrn@slave-01 ~]$ cat /etc/sysconfig/network
NETWORKING=yes
NETWORKING_IPV6=no
HOSTNAME=slave-01
[shirdrn@slave-01 ~]$ cat /etc/hosts
# Do not remove the following line, or various programs
# that require network functionality will fail.
127.0.0.1               localhost localhost
#::1            localhost6.localdomain6 localhost6

192.168.0.180   master  master
192.168.0.181   slave-01        slave-01
192.168.0.182   slave-02        slave-02
192.168.0.183   slave-03        slave-03
The other two slave nodes check out as well; their output is omitted here.

After comparing them all, the hostname and hosts configuration turned out to be consistent everywhere, so this was not the cause.
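
For reference, a quick way to run this comparison on each node without eyeballing the files is the following rough sketch (repeat it on every node): the name printed by hostname must appear in the hosts file, and the addresses returned for each name must match the nodes' real interface addresses. The output shown simply mirrors the hosts files above.

[shirdrn@slave-01 ~]$ hostname
slave-01
[shirdrn@slave-01 ~]$ getent hosts master slave-01 slave-02 slave-03
192.168.0.180   master master
192.168.0.181   slave-01 slave-01
192.168.0.182   slave-02 slave-02
192.168.0.183   slave-03 slave-03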

In fact, I had already suspected the firewall. After some more digging I finally found a report of the same problem (see http://samwalt.iteye.com/blog/1099348): the firewall has to be turned off. Run the following command on every machine in the cluster:

[shirdrn@master ~]$ su root
Password: 
[root@master shirdrn]# service iptables stop
Flushing firewall rules: [  OK  ]
Setting chains to policy ACCEPT: filter [  OK  ]
Unloading iptables modules: [  OK  ]
You have to switch to the root user to stop the firewall.

After that, start the Hadoop cluster again and everything works.
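
One caveat: service iptables stop only disables the firewall until the next reboot. On RHEL 5 you can either keep it off permanently with chkconfig, or, more conservatively, leave the firewall running and just open the ports Hadoop uses. The rules below are only a sketch: port 9000 is the Namenode RPC port from this setup, and 50010 is the default DataNode data-transfer port; adjust the list to match your own configuration.

[root@master shirdrn]# chkconfig iptables off

Or, keeping iptables running but allowing the Hadoop ports:

[root@master shirdrn]# iptables -I INPUT -p tcp --dport 9000 -j ACCEPT
[root@master shirdrn]# iptables -I INPUT -p tcp --dport 50010 -j ACCEPT
[root@master shirdrn]# service iptables save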
