Common Hadoop Problems and Solutions

1. Deleting a file fails with a SafeModeException:

rm: Multiple IOExceptions: [org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot delete /user/node/input/test.xml. Name node is in safe mode.

Solution:

hadoop dfsadmin -safemode leave

This command tells the NameNode to leave safe mode. (On Hadoop 2.x and later, `hadoop dfsadmin` is deprecated in favor of `hdfs dfsadmin`.)
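Before forcing the NameNode out of safe mode, it is worth checking why it is there; the NameNode enters safe mode on startup and normally leaves it on its own once enough DataNodes have reported their blocks. A minimal sketch of that check (these commands require a live cluster, so they are shown as a CLI fragment rather than a runnable script):

```
# Report the current state: prints "Safe mode is ON" or "Safe mode is OFF"
hdfs dfsadmin -safemode get

# Check filesystem health; missing or corrupt blocks can keep safe mode stuck
hdfs fsck /

# Only force safe mode off once the cluster looks healthy
hdfs dfsadmin -safemode leave
```

If `fsck` reports missing blocks, leaving safe mode manually only hides the symptom; the underlying DataNode or block problem should be fixed first.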


2. A MapReduce job fails with a ShuffleError; the stack trace looks like this:

org.apache.hadoop.mapreduce.task.reduce.Shuffle$ShuffleError: error in shuffle in fetcher#5
	at org.apache.hadoop.mapreduce.task.reduce.Shuffle.run(Shuffle.java:124)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:362)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:217)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:742)
	at org.apache.hadoop.mapred.Child.main(Child.java:211)
Caused by: java.io.IOException: Exceeded MAX_FAILED_UNIQUE_FETCHES; bailing-out.
	at org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler.checkReducerHealth(ShuffleScheduler.java:253)
	at org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler.copyFailed(ShuffleScheduler.java:187)
	at org.apache.hadoop.mapreduce.task.reduce.Fetcher.copyFromHost(Fetcher.java:227)
	at org.apache.hadoop.mapreduce.task.reduce.Fetcher.run(Fetcher.java:149)

Solution:
The problem is in the hosts configuration: reducers cannot resolve the other nodes' hostnames to fetch map output, so fetches fail until MAX_FAILED_UNIQUE_FETCHES is exceeded. Add hostname-to-IP mappings for every node to /etc/hosts on all nodes, for example:
192.168.137.10  name-node
192.168.137.11  node-1
192.168.137.12  node-2
192.168.137.13  node-3
192.168.137.14  node-4
192.168.137.15  node-5
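A quick way to confirm the mappings took effect is to check that every cluster hostname resolves on each node. A minimal sketch, using the hostnames from the example mapping above:

```shell
#!/bin/sh
# Sketch: verify each cluster hostname resolves (via /etc/hosts or DNS).
# The hostnames are the ones from the example /etc/hosts entries above.
for host in name-node node-1 node-2 node-3 node-4 node-5; do
    if getent hosts "$host" > /dev/null 2>&1; then
        echo "$host: ok"
    else
        echo "$host: NOT resolvable - check /etc/hosts" >&2
    fi
done
```

Run this on every node; any "NOT resolvable" line means that node's /etc/hosts is missing an entry, which is exactly the condition that produces the MAX_FAILED_UNIQUE_FETCHES error above.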

(This article will be updated over time.)
