Hadoop: Fixing "Could not locate executable null"

Article source: http://www.aboutyun.com/thread-8030-1-1.html

Questions addressed:

1. You create a MapReduce Project and hit "Could not locate executable null" at runtime. How do you fix it?
2. You hit "Could not locate executable …\hadoop-2.2.0\hadoop-2.2.0\bin\winutils.exe in the Hadoop binaries." How do you fix it?

1. I created a MapReduce Project and hit a problem at runtime:

java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

Stepping into the code shows this is a HADOOP_HOME problem: if HADOOP_HOME is empty, fullExeName inevitably becomes null\bin\winutils.exe. The fix is simple enough: configure the environment variable properly. If you don't want to restart the machine, you can add System.setProperty("hadoop.home.dir", "…"); to the MapReduce program as a stopgap (see the sketch after the code below). The relevant code is in org.apache.hadoop.util.Shell.java:

public static final String getQualifiedBinPath(String executable) 
  throws IOException {
    // construct hadoop bin path to the specified executable
    String fullExeName = HADOOP_HOME_DIR + File.separator + "bin" 
      + File.separator + executable;

    File exeFile = new File(fullExeName);
    if (!exeFile.exists()) {
      throw new IOException("Could not locate executable " + fullExeName
        + " in the Hadoop binaries.");
    }

    return exeFile.getCanonicalPath();
  }

private static String HADOOP_HOME_DIR = checkHadoopHome();
private static String checkHadoopHome() {

    // first check the Dflag hadoop.home.dir with JVM scope
    String home = System.getProperty("hadoop.home.dir");

    // fall back to the system/user-global env variable
    if (home == null) {
      home = System.getenv("HADOOP_HOME");
    }
     ...
}
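If you take the stopgap route, here is a minimal sketch (the driver class name and the local Hadoop path are hypothetical placeholders; point it at wherever you unpacked hadoop-2.2.0):

import org.apache.hadoop.conf.Configuration;

public class Driver {
  public static void main(String[] args) throws Exception {
    // Set this before any Hadoop code runs, so checkHadoopHome() sees it.
    // Placeholder path: substitute your own hadoop-2.2.0 directory.
    System.setProperty("hadoop.home.dir", "D:/hadoop-2.2.0");

    Configuration conf = new Configuration();
    // ... configure and submit the MapReduce job as usual ...
  }
}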
2. At this point we get the full path fullExeName; on my machine it is D:\Hadoop\tar\hadoop-2.2.0\hadoop-2.2.0\bin\winutils.exe. Continuing execution, another error appears:

Could not locate executable D:\Hadoop\tar\hadoop-2.2.0\hadoop-2.2.0\bin\winutils.exe in the Hadoop binaries.

Looking in that directory, winutils.exe simply isn't there. Download one from https://github.com/srccodes/hadoop-common-2.2.0-bin and drop it in, and that error goes away.
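A quick sanity check mirrors what getQualifiedBinPath does (a minimal sketch, assuming HADOOP_HOME is set; the class name is mine):

import java.io.File;

public class WinutilsCheck {
  public static void main(String[] args) {
    // Same path Shell builds: %HADOOP_HOME%\bin\winutils.exe
    String home = System.getenv("HADOOP_HOME");
    File exe = new File(home, "bin" + File.separator + "winutils.exe");
    System.out.println(exe.getAbsolutePath() + " exists: " + exe.exists());
  }
}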
3. The problems continue:

at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:435)

Following the code again, into org.apache.hadoop.util.Shell.java:

public static String[] getSetPermissionCommand(String perm, boolean recursive,
                                                 String file) {
    String[] baseCmd = getSetPermissionCommand(perm, recursive);
    String[] cmdWithFile = Arrays.copyOf(baseCmd, baseCmd.length + 1);
    cmdWithFile[cmdWithFile.length - 1] = file;
    return cmdWithFile;
  }

  /** Return a command to set permission */
  public static String[] getSetPermissionCommand(String perm, boolean recursive) {
    if (recursive) {
      return (WINDOWS) ? new String[] { WINUTILS, "chmod", "-R", perm }
                         : new String[] { "chmod", "-R", perm };
    } else {
      return (WINDOWS) ? new String[] { WINUTILS, "chmod", perm }
                       : new String[] { "chmod", perm };
    }
  }
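For reference, the same call can be reproduced in isolation with a small sketch (the target file path is a placeholder; Shell.execCommand is the method that appears in the stack trace above):

import org.apache.hadoop.util.Shell;

public class ChmodCheck {
  public static void main(String[] args) throws Exception {
    // Builds {winutils.exe, "chmod", "755", file} on Windows and runs it.
    String[] cmd = Shell.getSetPermissionCommand("755", false, "D:/tmp/xxxfile");
    System.out.println(Shell.execCommand(cmd));
  }
}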

In my case, the cmdWithFile array contained {"D:\Hadoop\tar\hadoop-2.2.0\hadoop-2.2.0\bin\winutils.exe", "chmod", "755", "xxxfile"}. I ran that command by itself in cmd and got:

The program can't start because MSVCR100.dll is missing from your computer.

So download one from http://files.cnblogs.com/sirkevin/msvcr100.rar and drop it into C:\Windows\System32. Run the command in cmd again, and another problem appears:

The application was unable to start correctly (0xc000007b).

Download DirectX_Repair (see http://blog.csdn.net/vbcom/article/details/7245186) to fix this, and remember to reboot after the repair. With that done, the command runs fine in cmd.
4. At this point there is light at the end of the tunnel, but yet another problem appears:

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z

Stepping into the code:

  /** Windows only method used to check if the current process has
   *  requested access rights on the given path. */
  private static native boolean access0(String path, int requestedAccess);

Clearly a dll file is missing. Remember the download from https://github.com/srccodes/hadoop-common-2.2.0-bin? It includes hadoop.dll. The cleanest fix is to replace your local hadoop's bin directory with hadoop-common-2.2.0-bin-master/bin, configure PATH=%HADOOP_HOME%\bin in the environment variables, and restart the machine.

Download address:

hadoop family, strom, spark, Linux, flume jar packages and installers, collected downloads (continuously updated)

Things to note:
The environment variables must be configured correctly, otherwise it still won't run.
Set PATH=%HADOOP_HOME%\bin; if that doesn't work, substitute the absolute path.
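After replacing the bin directory and fixing PATH, you can check whether the native library actually loads before re-running the job (a minimal sketch; the class name is mine, isNativeCodeLoaded belongs to the NativeCodeLoader class shown in section 6 below):

import org.apache.hadoop.util.NativeCodeLoader;

public class NativeLoadCheck {
  public static void main(String[] args) {
    // True only if the static block in NativeCodeLoader managed to
    // load hadoop.dll from java.library.path.
    System.out.println("native-hadoop loaded: " + NativeCodeLoader.isNativeCodeLoaded());
  }
}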

5. Finally, the MapReduce job produces the correct output, output99.

Summary

  • The hadoop eclipse plugin is not required. As I saw it, its value came down to the three points below (this is actually a mistaken view; see http://zy19982004.iteye.com/blog/2031172 for details). study-hadoop is an ordinary project, and running it directly (without going through the Run on Hadoop elephant) still lets you debug into MapReduce.

      • Visualizing the files inside hadoop.
      • Pulling in the dependency jars for you when you create a MapReduce Project.
      • Having all the configuration information already in place by the time you write Configuration conf = new Configuration();.
  • Better still, download the hadoop 2.2 source and build it yourself; that way there should be no problems at all (not personally verified).

6. Other issues
Still hitting:

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z

Follow the code into org.apache.hadoop.util.NativeCodeLoader.java:

static {
    // Try to load native hadoop library and set fallback flag appropriately
    if(LOG.isDebugEnabled()) {
      LOG.debug("Trying to load the custom-built native-hadoop library...");
    }
    try {
      System.loadLibrary("hadoop");
      LOG.debug("Loaded the native-hadoop library");
      nativeCodeLoaded = true;
    } catch (Throwable t) {
      // Ignore failure to load
      if(LOG.isDebugEnabled()) {
        LOG.debug("Failed to load native-hadoop with error: " + t);
        LOG.debug("java.library.path=" +
            System.getProperty("java.library.path"));
      }
    }
    
    if (!nativeCodeLoaded) {
      LOG.warn("Unable to load native-hadoop library for your platform... " +
               "using builtin-java classes where applicable");
    }
  }

The error reported here is:

DEBUG org.apache.hadoop.util.NativeCodeLoader - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: HADOOP_HOME\bin\hadoop.dll: Can't load AMD 64-bit .dll on a IA 32-bit platform

I suspected the 32-bit JDK was the culprit; after switching to a 64-bit JDK, the problem was gone:

    2014-03-11 19:43:08,805 DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
    2014-03-11 19:43:08,812 DEBUG org.apache.hadoop.util.NativeCodeLoader - Loaded the native-hadoop library

This also resolves a common warning:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
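If you are unsure whether the JDK in use is 32-bit or 64-bit, a quick sketch like this tells you (sun.arch.data.model is HotSpot-specific, so treat it as a best-effort check):

public class JvmArchCheck {
  public static void main(String[] args) {
    // "32" or "64" on HotSpot JVMs; os.arch reports e.g. x86 or amd64.
    System.out.println("data model: " + System.getProperty("sun.arch.data.model"));
    System.out.println("os.arch: " + System.getProperty("os.arch"));
  }
}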
