A MapReduce job failed because of corrupted files in the HDFS file system. To recover, first search HDFS for files that were not closed properly:
hadoop fsck /logs/ -files -openforwrite | grep "MISSING"
Write the matching entries to a file:
hadoop fsck /logs/ -files -openforwrite | grep "MISSING" > test.txt
The contents of test.txt are as follows:
test_1999-01-11-14.log.14b52615a49ZZ28520@test1.file 2573 bytes, 1 block(s), OPENFORWRITE: MISSING 1 blocks of total size 2573 B
test_1999-01-13-22.log.14b2ab99756ZZ18137@test2.file 897 bytes, 1 block(s), OPENFORWRITE: MISSING 1 blocks of total size 897 B
Extract the file paths (the first whitespace-separated field of each line):
[dataanalyze@n1 ~]$ cat test.txt | awk -F ' ' '{print $1}'
test_1999-01-11-14.log.14b52615a49ZZ28520@test1.file
test_1999-01-13-22.log.14b2ab99756ZZ18137@test2.file
Delete the corrupted files (note that `xargs` needs `-I {}` for the `{}` placeholder to be substituted):
cat test.txt | awk -F ' ' '{print $1}' | xargs -I {} hdfs dfs -rm {}
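The extract-and-delete steps above can be rehearsed safely before touching the cluster. The sketch below is a dry run under the assumption that the fsck output has the format shown earlier: it substitutes a here-document of sample output for the real `hadoop fsck` call, and prefixes the delete command with `echo` so nothing is actually removed. Drop the `echo` to perform the real `hdfs dfs -rm`.

```shell
#!/bin/sh
# Stand-in for: hadoop fsck /logs/ -files -openforwrite | grep "MISSING" > test.txt
cat > test.txt <<'EOF'
test_1999-01-11-14.log.14b52615a49ZZ28520@test1.file 2573 bytes, 1 block(s), OPENFORWRITE: MISSING 1 blocks of total size 2573 B
test_1999-01-13-22.log.14b2ab99756ZZ18137@test2.file 897 bytes, 1 block(s), OPENFORWRITE: MISSING 1 blocks of total size 897 B
EOF

# Take the first field (the HDFS path) of each line and print the
# delete command that would run; remove "echo" to delete for real.
awk '{print $1}' test.txt | xargs -I {} echo hdfs dfs -rm {}
```

Running the script prints one `hdfs dfs -rm` command per corrupted file, so the list can be reviewed before the deletion is executed for real.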