I just started learning MapReduce.
After writing the WordCount code (the code itself is correct), running it fails with the following error:
Error log:
19/11/12 07:57:19 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
19/11/12 07:57:19 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
19/11/12 07:57:19 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
19/11/12 07:57:19 INFO mapred.JobClient: Cleaning up the staging area file:/tmp/hadoop-root/mapred/staging/okakio637697353/.staging/job_local637697353_0001
19/11/12 07:57:19 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:0: No such file or directory
Exception in thread "main" 0: No such file or directory
After searching on Baidu, I found the following suggested solutions:
1:
Suspected cause: mapred-site.xml is misconfigured. The correct configuration is as follows:
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
Tried it; it did not work.
2:
Set the following property in core-site.xml on every NodeManager (NM):
<property>
<name>dfs.permissions</name>
<value>false</value>
</property>
Tried it; it did not work.
3:
Add extra code to the driver, for example:
System.setProperty("HADOOP_USER_NAME", "root");
and
conf.set("dfs.permissions", "false");
It still fails with the same error.
Final solution:
Open Everything and search for /tmp/hadoop-root/, then move that tmp directory onto the same drive as the Hadoop installation. Run the job again and it succeeds.
Note: if your username contains Chinese characters, the derived tmp path becomes garbled and Hadoop will not create the tmp directory for you. In that case, set the directory explicitly in the code (note the straight quotes and the doubled backslashes required in a Java string literal): conf.set("hadoop.tmp.dir", "your-own-path"), e.g.:
conf.set("hadoop.tmp.dir", "E:\\tmp\\hadoop-abc");
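The username check behind that note can be sketched in plain Java. This is only an illustration, not Hadoop API: `isAsciiPath` and `chooseTmpDir` are hypothetical helpers, and `E:\tmp\hadoop-abc` is just the example fallback path from above. The idea is that the local default staging path is derived from `user.name` (e.g. `/tmp/hadoop-<user>`), so a non-ASCII username poisons the path and you should pass an explicit ASCII-only directory to `conf.set("hadoop.tmp.dir", ...)` instead.

```java
// Sketch: decide whether the default tmp path derived from the username is
// safe, or whether an explicit ASCII-only hadoop.tmp.dir should be used.
// isAsciiPath and chooseTmpDir are illustrative helpers, not Hadoop APIs.
public class TmpDirCheck {

    // True if every character in the path is plain ASCII.
    static boolean isAsciiPath(String path) {
        for (int i = 0; i < path.length(); i++) {
            if (path.charAt(i) > 127) {
                return false;
            }
        }
        return true;
    }

    // The local default looks like /tmp/hadoop-<user>; if the username is
    // non-ASCII (e.g. Chinese), fall back to an explicit safe directory.
    static String chooseTmpDir(String userName, String fallback) {
        String candidate = "/tmp/hadoop-" + userName;
        return isAsciiPath(candidate) ? candidate : fallback;
    }

    public static void main(String[] args) {
        // ASCII username: the derived default path is fine.
        System.out.println(chooseTmpDir("okakio", "E:\\tmp\\hadoop-abc"));
        // Chinese username: use the explicit fallback instead.
        System.out.println(chooseTmpDir("张三", "E:\\tmp\\hadoop-abc"));
    }
}
```

In the real driver you would then pass the chosen path to `conf.set("hadoop.tmp.dir", ...)` before submitting the job, as shown in the note above.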