Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException

The problem is as follows:

Because I need to operate on HBase from MapReduce, I imported all of the .jar files shipped with HBase into my MapReduce project in Eclipse. When the job writes to HBase it fails with the error below, and I spent a long time unable to find the cause:
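For context, the stack trace below comes from a map-only job that writes `Put`s through `TableOutputFormat`; the puts are buffered and only flushed when the record writer is closed, which is why the failure surfaces in `HTable.close`. A minimal sketch of such a job follows (class name, row-key scheme, and input path are placeholders, not taken from the original project, and it needs the Hadoop/HBase jars on the classpath to compile):

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.mapreduce.TableOutputFormat;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

public class EventLogsLoader {

    // Mapper that turns each input line into a Put against the 'info' family.
    public static class LoadMapper
            extends Mapper<LongWritable, Text, NullWritable, Put> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            Put put = new Put(Bytes.toBytes(key.get()));  // row key: file offset (placeholder)
            put.addColumn(Bytes.toBytes("info"),          // family must exist in the target table
                          Bytes.toBytes("line"),
                          Bytes.toBytes(value.toString()));
            context.write(NullWritable.get(), put);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // The table named here must already exist in HBase; if it does not,
        // every buffered Put is retried until the retry budget is exhausted,
        // and the writer's close() throws RetriesExhaustedWithDetailsException.
        conf.set(TableOutputFormat.OUTPUT_TABLE, "event_logs");

        Job job = Job.getInstance(conf, "load-event-logs");
        job.setJarByClass(EventLogsLoader.class);
        job.setMapperClass(LoadMapper.class);
        job.setOutputFormatClass(TableOutputFormat.class);
        job.setNumReduceTasks(0);  // map-only, matching the "reduce 0%" lines in the log
        FileInputFormat.addInputPath(job, new Path(args[0]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```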

Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 176 actions: event_logs: 176 times, 

at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:192)
at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:176)
at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:913)
at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:984)
at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1252)
at org.apache.hadoop.hbase.client.HTable.close(HTable.java:1289)
at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.close(TableOutputFormat.java:112)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:667)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:790)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)


17/06/29 20:59:13 INFO mapreduce.Job:  map 0% reduce 0%
17/06/29 20:59:32 INFO mapreduce.Job:  map 68% reduce 0%
17/06/29 20:59:35 INFO mapreduce.Job:  map 100% reduce 0%
17/06/29 20:59:35 INFO mapreduce.Job: Task Id : attempt_1498732164664_0003_m_000000_1, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 176 actions: event_logs: 176 times, 
	(stack trace identical to the first failed attempt above)


17/06/29 20:59:36 INFO mapreduce.Job:  map 0% reduce 0%
17/06/29 20:59:55 INFO mapreduce.Job:  map 43% reduce 0%
17/06/29 20:59:58 INFO mapreduce.Job:  map 100% reduce 0%
17/06/29 20:59:59 INFO mapreduce.Job: Task Id : attempt_1498732164664_0003_m_000000_2, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 176 actions: event_logs: 176 times, 
	(stack trace identical to the first failed attempt above)


17/06/29 21:00:00 INFO mapreduce.Job:  map 0% reduce 0%
17/06/29 21:00:21 INFO mapreduce.Job:  map 100% reduce 0%
17/06/29 21:00:26 INFO mapreduce.Job: Job job_1498732164664_0003 failed with state FAILED due to: Task failed task_1498732164664_0003_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0


17/06/29 21:00:27 INFO mapreduce.Job: Counters: 9
Job Counters 
Failed map tasks=4
Launched map tasks=4
Other local map tasks=3
Data-local map tasks=1
Total time spent by all maps in occupied slots (ms)=91969
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=91969
Total vcore-seconds taken by all map tasks=91969
Total megabyte-seconds taken by all map tasks=94176256

17/06/29 21:00:27 INFO etl.AnalyserLogDataRunner: Job execution failed!

The problem turned out to be the table I had created. In the hbase shell I had originally run create 'even_logs','info' (note the missing "t"), while the job writes to 'event_logs'. After recreating the table with create 'event_logs','info' the error went away. So the table name (and likewise the column family name) used when creating the table must match exactly what the code references. Viewing the data of mrtable in the hbase shell:
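Before resubmitting a job, the fix can be double-checked directly in the hbase shell, using the same commands as above plus the built-in verification commands:

```
hbase(main):001:0> create 'event_logs', 'info'
hbase(main):002:0> exists 'event_logs'    # should report: Table event_logs does exist
hbase(main):003:0> describe 'event_logs'  # confirms the 'info' column family is present
```

Checking `exists`/`describe` first avoids burning through the client's full retry budget (the "Failed 176 actions" above) just to discover a misspelled table name.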

