14/09/10 15:42:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/09/10 15:42:55 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-hadoop\mapred\staging\hadoop-1275750701\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-hadoop\mapred\staging\hadoop-1275750701\.staging to 0700
	at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
	at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
	at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
	at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:918)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
	at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
	at hadoop.run.HadoopMain.run(HadoopMain.java:43)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
	at hadoop.run.HadoopMain.main(HadoopMain.java:89)
Solution:

The program runs fine on the cluster when packaged as a jar, but the error above appears when the job is submitted from Eclipse. This is a Windows file-permission problem: on Windows, Hadoop runs against a simulated local environment that cannot apply POSIX permissions such as 0700 to the staging directory, so the workaround is fairly blunt.

Comment out the body of the checkReturnValue method in /hadoop-1.0.2/src/core/org/apache/hadoop/fs/FileUtil.java, like this:
  private static void checkReturnValue(boolean rv, File p,
                                       FsPermission permission
                                       ) throws IOException {
    /*
    if (!rv) {
      throw new IOException("Failed to set permissions of path: " + p +
                            " to " +
                            String.format("%04o", permission.toShort()));
    }
    */
  }

Then recompile and repackage hadoop-core-1.0.2.jar, and replace the hadoop-core-1.0.2.jar in the hadoop-1.0.2 root directory with the rebuilt one.

If you cannot rebuild the whole hadoop-core-1.0.2.jar, there is a lighter alternative: compile only /hadoop-1.0.2/src/core/org/apache/hadoop/fs/FileUtil.java. The steps:

1. In Eclipse, create a new MapReduce project.
2. Copy the org folder under /hadoop-1.0.2/src/core, together with all of its subdirectories, into the project's src folder.
3. Open /hadoop-1.0.2/src/core/org/apache/hadoop/fs/FileUtil.java, apply the change above, and save.
4. In the workspace, find the project's compiled .class files (FileUtil$CygPathCommand.class and FileUtil.class).
5. Open hadoop-core-1.0.2.jar with an archive tool and replace those two entries with the freshly compiled ones.