【Hadoop】- Hadoop Exception Handling

Exception 1:

2016-12-31 22:39:45,304 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: NameNode/192.168.174.128:9090. Already tried 9 time(s).
2016-12-31 22:39:46,314 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: Call to NameNode/192.168.174.128:9090 failed on local exception: java.net.NoRouteToHostException: No route to host
	at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
	at org.apache.hadoop.ipc.Client.call(Client.java:743)
	at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
	at com.sun.proxy.$Proxy4.getProtocolVersion(Unknown Source)
	at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
	at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:346)
	at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:383)
	at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:314)
	at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:291)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:269)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:216)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1283)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1238)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1246)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1368)
Caused by: java.net.NoRouteToHostException: No route to host
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
	at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
	at org.apache.hadoop.ipc.Client.call(Client.java:720)
	... 13 more
2016-12-31 22:39:46,316 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: 
************************************************************
SHUTDOWN_MSG: Shutting down DataNode at DataNode_01/192.168.174.129
************************************************************

Solution: the DataNode cannot reach the NameNode because Linux firewall rules are blocking master-slave communication. Simply stop the firewall on every node: `sudo service iptables stop` (run this once on each node). Left unfixed, this problem causes the DataNode process to start up and then shut itself down shortly afterwards.


Exception 2:

Problem: after Hadoop starts, the DataNode process on a slave node does not come up properly.

Solution: check the data-storage directory configured in hdfs-site.xml; if the configured directory does not exist, the DataNode will not start.
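For reference, the relevant property is `dfs.datanode.data.dir` (Hadoop 2.x; it was `dfs.data.dir` in 1.x). A minimal hdfs-site.xml fragment might look like this; the path shown is only an example and must actually exist and be writable on the node:

```xml
<configuration>
  <property>
    <!-- Directory where the DataNode stores HDFS blocks -->
    <name>dfs.datanode.data.dir</name>
    <value>/data/hadoop/hdfs/data</value>
  </property>
</configuration>
```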

Problem: when running a MapReduce job, the map phase works but the reduce phase inexplicably fails with NullPointerException or similar, even though the code and configuration are both fine.

Solution: the hostnames of the cluster nodes may be in an invalid format: they must not contain the underscore character '_'.
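As a quick sanity check (an illustrative snippet, not part of Hadoop), you can verify that a node's hostname contains only the characters allowed by RFC 952/1123 — letters, digits, hyphens, and dots — which rules out the underscore:

```java
// HostnameCheck.java - illustrative check that a hostname is RFC 1123-safe
public class HostnameCheck {
    // Returns true if the name contains only letters, digits, '-' and '.'
    static boolean isValidHostname(String name) {
        return name.matches("[A-Za-z0-9.-]+");
    }

    public static void main(String[] args) {
        System.out.println(isValidHostname("datanode-01"));  // true
        System.out.println(isValidHostname("data_node_01")); // false: '_' is not allowed
    }
}
```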


Exception 3:

Could not locate executable null\bin\winutils.exe in the Hadoop binaries

Analysis: Hadoop currently ships no dedicated support package for the Windows platform, so it falls back to Linux-style handling, which triggers this exception. The `null` in the reported path shows that neither HADOOP_HOME nor hadoop.home.dir could be resolved (see `checkHadoopHome()` below).

Solution: the relevant logic lives in org.apache.hadoop.util.Shell:

 public static final String getQualifiedBinPath(String executable) 
  throws IOException {
    // construct hadoop bin path to the specified executable
    String fullExeName = HADOOP_HOME_DIR + File.separator + "bin" 
      + File.separator + executable;

    File exeFile = new File(fullExeName);
    if (!exeFile.exists()) {
      throw new IOException("Could not locate executable " + fullExeName
        + " in the Hadoop binaries.");
    }

    return exeFile.getCanonicalPath();
  }

 private static String checkHadoopHome() {

    // first check the Dflag hadoop.home.dir with JVM scope
    String home = System.getProperty("hadoop.home.dir");

    // fall back to the system/user-global env variable
    if (home == null) {
      home = System.getenv("HADOOP_HOME");
    }

    try {
       // couldn't find either setting for hadoop's home directory
       if (home == null) {
         throw new IOException("HADOOP_HOME or hadoop.home.dir are not set.");
       }

       if (home.startsWith("\"") && home.endsWith("\"")) {
         home = home.substring(1, home.length()-1);
       }

       // check that the home setting is actually a directory that exists
       File homedir = new File(home);
       if (!homedir.isAbsolute() || !homedir.exists() || !homedir.isDirectory()) {
         throw new IOException("Hadoop home directory " + homedir
           + " does not exist, is not a directory, or is not an absolute path.");
       }

       home = homedir.getCanonicalPath();

    } catch (IOException ioe) {
      if (LOG.isDebugEnabled()) {
        LOG.debug("Failed to detect a valid hadoop home directory", ioe);
      }
      home = null;
    }
    
    return home;
  }

Place winutils.exe in the bin directory of the Hadoop distribution, then either set the HADOOP_HOME environment variable or set the JVM system property with System.setProperty("hadoop.home.dir", "D:\\Tools\\hadoop-2.7.3"); (note the doubled backslashes required in a Java string literal).
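A minimal sketch of the lookup order implemented by `checkHadoopHome()` above — the JVM property takes precedence over the environment variable. The D:\Tools path is just the example from this post, and the snippet is plain Java so no Hadoop jars are needed to try it:

```java
// HadoopHomeDemo.java - mimics the resolution order in Shell.checkHadoopHome():
// the JVM property hadoop.home.dir wins; HADOOP_HOME is the fallback.
public class HadoopHomeDemo {
    static String resolveHadoopHome() {
        String home = System.getProperty("hadoop.home.dir"); // JVM-scoped -D flag
        if (home == null) {
            home = System.getenv("HADOOP_HOME");             // fall back to the env var
        }
        return home;
    }

    public static void main(String[] args) {
        // Must be set before any Hadoop class touches Shell, e.g. first thing in main()
        System.setProperty("hadoop.home.dir", "D:\\Tools\\hadoop-2.7.3");
        System.out.println(resolveHadoopHome()); // prints D:\Tools\hadoop-2.7.3
    }
}
```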

Reposted from: https://my.oschina.net/yangzhiwei256/blog/3014258
