Yesterday, while a program was running, the office suddenly lost power. After power was restored, the Spark job would no longer start. The log showed:
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create directory /log. Name node is in safe mode.
The reported blocks 339192 needs additional 120824 blocks to reach the threshold 0.9990 of total blocks 460477.
The minimum number of live datanodes is not required. Safe mode will be turned off automatically once the thresholds have been reached. NamenodeHostName:master
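The log states the cause directly: the NameNode stays in safe mode until the reported blocks reach 99.9% of the total. You can confirm the current state with hdfs dfsadmin -safemode get, and either wait or force an exit with hdfs dfsadmin -safemode leave (though leaving safe mode does not bring back the missing blocks). As a minimal sketch, the threshold arithmetic from the log above works out like this (block counts taken from the log; the real check is the dfsadmin command, not this script):

```shell
# Block counts copied from the NameNode log above.
reported=339192
total=460477
threshold=0.999

# Fraction of blocks reported so far, and whether it clears the threshold.
ratio=$(awk -v r="$reported" -v t="$total" 'BEGIN { printf "%.4f", r / t }')
in_safe_mode=$(awk -v x="$ratio" -v th="$threshold" 'BEGIN { print (x < th) ? "yes" : "no" }')

echo "reported ratio: $ratio, still in safe mode: $in_safe_mode"
# Only about 73.7% of blocks are reported, far below 99.9%, so the
# NameNode will not exit safe mode on its own with blocks missing.
```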
Resolution:
1. Run hadoop fsck / -files to check the HDFS files.
The check ended by reporting the filesystem as corrupt:
Status: corrupt
2. Run hadoop fsck / -delete to remove the corrupted files.
3. Run hadoop fsck / -files again; the filesystem is now healthy:
Status: HEALTHY
Number of data-nodes: 3
Number of racks: 1
Total dirs: 107581
Total symlinks: 0
Note: this approach causes data loss, because the corrupted blocks are deleted outright. If no better repair method can be found, the only option is to delete the damaged files and regenerate the data.
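For reference, hadoop fsck also accepts a -move option that relocates corrupted files into /lost+found instead of deleting them, which may be preferable when partial recovery is still possible. The health check in step 3 comes down to reading the Status line of the fsck report; a minimal sketch (the parse_status helper is hypothetical, not part of Hadoop, and the inputs here are simulated reports matching the two states shown above):

```shell
# Hypothetical helper: pull the value from the first "Status:" line
# of an fsck report read on stdin.
parse_status() {
  grep -m1 '^Status:' | awk '{ print $2 }'
}

# Simulated reports; in practice you would capture a real one with
# something like: hadoop fsck / -files > report.txt
before=$(printf 'Status: corrupt\n' | parse_status)
after=$(printf 'Status: HEALTHY\nNumber of data-nodes: 3\n' | parse_status)

echo "before cleanup: $before, after cleanup: $after"
```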