java.io.IOException: Cannot obtain block length for LocatedBlock

On an HDP 2.1 cluster, a MapReduce job failed with the exception `java.io.IOException: Cannot obtain block length for LocatedBlock`. The error surfaced while a task was reading a data file, and analysis of the Hadoop logs traced it to the HDFS input stream. The fix was to run the `hdfs debug recoverLease -path <path> -retries 3` command.

I am running an HDP 2.1 cluster. I hit the exception below, which makes the MapReduce job fail. We routinely build tables from data ingested by Flume (version 1.4). I inspected the data file the mapper was trying to read, but could not find anything wrong with it.

```
2014-11-28 00:08:28,696 WARN [main] org.apache.hadoop.metrics2.impl.MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-maptask.properties,hadoop-metrics2.properties
2014-11-28 00:08:28,947 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2014-11-28 00:08:28,947 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system started
2014-11-28 00:08:28,995 INFO [main] org.apache.hadoop.mapred.YarnChild: Executing with tokens:
2014-11-28 00:08:29,009 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: mapreduce.job, Service: job_1417095534232_0051, Ident: (org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier@ea23517)
2014-11-28 00:08:29,184 INFO [main] org.apache.hadoop.mapred.YarnChild: Sleeping for 0ms before retrying again. Got null now.
2014-11-28 00:08:29,735 INFO [main] org.apache.hadoop.mapred.YarnChild: mapreduce.cluster.local.dir for child: /hadoop1/hadoop/yarn/local/usercache/xxx/appcache/application_1417095534232_0051,/hadoop2/hadoop/yarn/local/usercache/xxx/appcache/application_1417095534232_0051,/hadoop3/hadoop/yarn/local/usercache/xxx/appcache/application_1417095534232_0051,/hadoop4/hadoop/yarn/local/usercache/xxx/appcache/application_1417095534232_0051,/hadoop5/hadoop/yarn/local/usercache/xxx/appcache/application_1417095534232_0051,/hadoop6/hadoop/yarn/local/usercache/xxx/appcache/application_1417095534232_0051
2014-11-28 00:08:31,067 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
2014-11-28 00:08:32,806 INFO [main] org.apache.hadoop.mapred.Task: Using ResourceCalculatorProcessTree : [ ]
2014-11-28 00:08:33,837 INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: com.hadoop.mapred.DeprecatedLzoTextInputFormat:hdfs://cluster/apps/hive/external/mapp_log/dt=2014-11-27/mapp_parse_log.1417014001075:402653184+67311787
2014-11-28 00:08:34,196 INFO [main] org.apache.hadoop.hive.ql.log.PerfLogger:
2014-11-28 00:08:34,196 INFO [main] org.apache.hadoop.hive.ql.exec.Utilities: Deserializing MapWork via kryo
2014-11-28 00:08:35,222 INFO [main] org.apache.hadoop.hive.ql.log.PerfLogger:
2014-11-28 00:08:35,254 INFO [main] com.hadoop.compression.lzo.GPLNativeCodeLoader: Loaded native gpl library
2014-11-28 00:08:35,260 INFO [main] com.hadoop.compression.lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev dbd51f0fb61f5347228a7a23fe0765ac1242fcdf]
2014-11-28 00:08:35,498 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.io.IOException: java.io.IOException: Cannot obtain block length for LocatedBlock{BP-1879195946-xx.xx.xx.32-1409281631059:blk_1075462091_1722425; getBlockSize()=202923; corrupt=false; offset=469762048; locs=[xx.xx.xx.36:50010, xx.xx.xx.37:50010]}
    at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
    at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:241)
    at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:573)
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:168)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:409)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Unknown Source)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.io.IOException: Cannot obtain block length for LocatedBlock{BP-1879195946-xx.xx.xx.32-1409281631059:blk_1075462091_1722425; getBlockSize()=202923; corrupt=false; offset=469762048; locs=[xx.xx.xx.36:50010, xx.xx.xx.37:50010]}
    at org.apache.hadoop.hdfs.DFSInputStream.readBlockLength(DFSInputStream.java:350)
    at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:294)
    at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:231)
    at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:224)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1295)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:300)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:296)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:296)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
    at org.apache.hadoop.mapred.LineRecordReader.<init>(LineRecordReader.java:108)
    at org.apache.hadoop.mapred.TextInputFormat.getRecordReader(TextInputFormat.java:67)
    at com.hadoop.mapred.DeprecatedLzoTextInputFormat.getRecordReader(DeprecatedLzoTextInputFormat.java:161)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:239)
    ... 9 more
2014-11-28 00:08:35,503 INFO [main] org.apache.hadoop.mapred.Task: Runnning cleanup for the task
```

If you have any thoughts on this problem or a way to fix it, please share them with me…

Thank you,

广宇

Best answer:

A detailed description of the problem and its cause can be found here:

https://community.hortonworks.com/answers/37414/view.html
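In short, the error generally means the file's last block was never finalized because a writer (here, presumably the Flume agent) died or never closed the file. One way to confirm this, assuming the Hive partition path shown in the mapper's split above, is to ask fsck for files that are still open for write:

```bash
# List files under the partition that are still open for write.
# An open file's last block has no finalized length, which is exactly
# what "Cannot obtain block length for LocatedBlock" complains about.
hdfs fsck /apps/hive/external/mapp_log/dt=2014-11-27 -files -openforwrite
```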

In our case, running the `hdfs debug recoverLease -path <path> -retries 3` command (substituting the affected file's HDFS path) resolved the problem.
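As a minimal sketch, assuming the affected file is the one named in the mapper's split above (run it once per file that fsck reports as open for write; the `hdfs debug` subcommand may not be present on older Hadoop releases):

```bash
# Force lease recovery so the NameNode closes the file and finalizes
# its last block, giving readers a definite block length.
# Retries up to 3 times if the first attempt does not succeed.
hdfs debug recoverLease \
  -path /apps/hive/external/mapp_log/dt=2014-11-27/mapp_parse_log.1417014001075 \
  -retries 3
```

Once the lease is recovered and the file is closed, re-running the failed MapReduce job should read it normally.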
