HBase: HLog write timeouts cause the RegionServer to exit

When the dfs.socket.timeout value configured in Hadoop's hdfs-site.xml is larger than the value HBase is using, HBase reports the errors below while writing its HLog (WAL):

Fix: make sure the dfs.socket.timeout value configured in Hadoop's hdfs-site.xml is the same value that HBase sees.
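A sketch of the relevant setting, assuming the HDFS default of 60000 ms; the specific number matters less than having the DataNodes and the HBase client (via the hdfs-site.xml on HBase's classpath, or an override in hbase-site.xml) agree on it:

```xml
<!-- hdfs-site.xml: keep this value identical on the DataNodes and on the
     copy of the HDFS client configuration that HBase loads.
     60000 ms is the HDFS default, shown here for illustration. -->
<property>
  <name>dfs.socket.timeout</name>
  <value>60000</value>
</property>
```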

The RegionServer on 10.9.141.165 logged:

2013-04-15 01:05:49,476 WARN org.apache.hadoop.hdfs.DFSClient: DFSOutputStream ResponseProcessor exception for block blk_5280454841001477955_73253980java.net.SocketTimeoutException: 69000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.9.141.165:23420 remote=/10.9.141.165:50010]

    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
    at java.io.DataInputStream.readFully(DataInputStream.java:178)
    at java.io.DataInputStream.readLong(DataInputStream.java:399)
    at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:122)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$ResponseProcessor.run(DFSClient.java:2514)

2013-04-15 01:05:49,476 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block blk_5280454841001477955_73253980 bad datanode[0] 10.9.141.165:50010
2013-04-15 01:05:49,476 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block blk_5280454841001477955_73253980 in pipeline 10.9.141.165:50010, 10.9.141.152:50010, 10.9.141.158:50010: bad datanode 10.9.141.165:50010
2013-04-15 01:06:55,633 WARN org.apache.hadoop.hdfs.DFSClient: DFSOutputStream ResponseProcessor exception for block blk_5280454841001477955_73262690java.net.SocketTimeoutException: 66000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.9.141.165:41078 remote=/10.9.141.152:50010]

    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
    at java.io.DataInputStream.readFully(DataInputStream.java:178)
    at java.io.DataInputStream.readLong(DataInputStream.java:399)
    at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:122)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$ResponseProcessor.run(DFSClient.java:2514)

2013-04-15 01:06:55,634 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block blk_5280454841001477955_73262690 bad datanode[0] 10.9.141.152:50010
2013-04-15 01:06:55,634 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block blk_5280454841001477955_73262690 in pipeline 10.9.141.152:50010, 10.9.141.158:50010: bad datanode 10.9.141.152:50010
2013-04-15 01:07:58,716 WARN org.apache.hadoop.hdfs.DFSClient: DFSOutputStream ResponseProcessor exception for block blk_5280454841001477955_73262880java.net.SocketTimeoutException: 63000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.9.141.165:48547 remote=/10.9.141.158:50010]

    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
    at java.io.DataInputStream.readFully(DataInputStream.java:178)
    at java.io.DataInputStream.readLong(DataInputStream.java:399)
    at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:122)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$ResponseProcessor.run(DFSClient.java:2514)

2013-04-15 01:07:58,718 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block blk_5280454841001477955_73262880 bad datanode[0] 10.9.141.158:50010
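The shrinking timeouts in the client log (69000, then 66000, then 63000 ms) are consistent with the HDFS 1.x client extending the base socket timeout by a fixed 3 s per DataNode remaining in the write pipeline (HdfsConstants.READ_TIMEOUT_EXTENSION), on top of dfs.socket.timeout. A minimal sketch of that arithmetic, assuming a 60 s base on the client side:

```python
# Illustrative only: reproduces the 69000 -> 66000 -> 63000 ms sequence seen
# in the RegionServer log as bad DataNodes are dropped from the pipeline.
READ_TIMEOUT_EXTENSION_MS = 3000  # per-DataNode extension in HDFS 1.x

def pipeline_read_timeout(base_timeout_ms, nodes_in_pipeline):
    """Effective ack-read timeout for a write pipeline of the given length."""
    return base_timeout_ms + READ_TIMEOUT_EXTENSION_MS * nodes_in_pipeline

for nodes in (3, 2, 1):
    print(nodes, pipeline_read_timeout(60000, nodes))
# 3 datanodes -> 69000 ms, 2 -> 66000 ms, 1 -> 63000 ms
```

This is also why the mismatch matters: if the DataNodes are configured with a larger dfs.socket.timeout than the HBase client, the client gives up on the pipeline ack first and marks healthy nodes as bad, one per retry, until the HLog write fails entirely.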

The three DataNodes in the pipeline logged the following:

10.9.141.152

2013-04-15 01:00:07,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_5280454841001477955_73253980 src: /10.9.141.165:39523 dest: /10.9.141.152:50010

2013-04-15 01:05:49,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in receiveBlock for block blk_5280454841001477955_73253980 java.io.EOFException: while trying to read 65557 bytes

2013-04-15 01:05:49,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_5280454841001477955_73253980 1 Exception java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[connected local=/10.9.141.152:59490 remote=/10.9.141.158:50010]. 110927 millis timeout left.

    at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)

    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)

    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)

    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)

    at java.io.DataInputStream.readFully(DataInputStream.java:178)

    at java.io.DataInputStream.readLong(DataInputStream.java:399)

    at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:122)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.run(BlockReceiver.java:868)

    at java.lang.Thread.run(Thread.java:662)

2013-04-15 01:05:49,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_5280454841001477955_73253980 1 : Thread is interrupted.

2013-04-15 01:05:49,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 1 for block blk_5280454841001477955_73253980 terminating

2013-04-15 01:05:49,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_5280454841001477955_73253980 received exception java.io.EOFException: while trying to read 65557 bytes

2013-04-15 01:05:49,474 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.9.141.152:50010, storageID=DS736845143, infoPort=50075, ipcPort=50020):DataXceiver

java.io.EOFException: while trying to read 65557 bytes

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:309)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:373)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:525)

    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:377)

    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)

    at java.lang.Thread.run(Thread.java:662)

2013-04-15 01:05:49,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Client calls recoverBlock(block=blk_5280454841001477955_73253980, targets=[10.9.141.152:50010, 10.9.141.158:50010])

2013-04-15 01:05:49,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: oldblock=blk_5280454841001477955_73253980(length=3121152), newblock=blk_5280454841001477955_73262690(length=3121152), datanode=10.9.141.152:50010

2013-04-15 01:05:49,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_5280454841001477955_73262690 src: /10.9.141.165:41078 dest: /10.9.141.152:50010

2013-04-15 01:05:49,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Reopen already-open Block for append blk_5280454841001477955_73262690

2013-04-15 01:06:55,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_5280454841001477955_73262690 1 Exception java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[connected local=/10.9.141.152:60943 remote=/10.9.141.158:50010]. 113932 millis timeout left.

    at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)

    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)

    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)

    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)

    at java.io.DataInputStream.readFully(DataInputStream.java:178)

    at java.io.DataInputStream.readLong(DataInputStream.java:399)

    at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:122)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.run(BlockReceiver.java:868)

    at java.lang.Thread.run(Thread.java:662)

2013-04-15 01:06:55,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_5280454841001477955_73262690 1 : Thread is interrupted.

2013-04-15 01:06:55,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 1 for block blk_5280454841001477955_73262690 terminating

2013-04-15 01:06:55,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_5280454841001477955_73262690 received exception java.io.EOFException: while trying to read 65557 bytes

2013-04-15 01:06:55,632 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.9.141.152:50010, storageID=DS736845143, infoPort=50075, ipcPort=50020):DataXceiver

java.io.EOFException: while trying to read 65557 bytes

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:309)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:373)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:525)

    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:377)

    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)

    at java.lang.Thread.run(Thread.java:662)


10.9.141.158

2013-04-15 01:00:07,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_5280454841001477955_73253980 src: /10.9.141.152:59490 dest: /10.9.141.158:50010

2013-04-15 01:05:49,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in receiveBlock for block blk_5280454841001477955_73253980 java.io.EOFException: while trying to read 65557 bytes

2013-04-15 01:05:49,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 0 for block blk_5280454841001477955_73253980 Interrupted.

2013-04-15 01:05:49,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 0 for block blk_5280454841001477955_73253980 terminating

2013-04-15 01:05:49,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_5280454841001477955_73253980 received exception java.io.EOFException: while trying to read 65557 bytes

2013-04-15 01:05:49,495 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.9.141.158:50010, storageID=DS2062116090, infoPort=50075, ipcPort=50020):DataXceiver

java.io.EOFException: while trying to read 65557 bytes

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:309)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:373)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:525)

    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:377)

    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)

    at java.lang.Thread.run(Thread.java:662)

2013-04-15 01:05:49,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: oldblock=blk_5280454841001477955_73253980(length=3121152), newblock=blk_5280454841001477955_73262690(length=3121152), datanode=10.9.141.158:50010

2013-04-15 01:05:49,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_5280454841001477955_73262690 src: /10.9.141.152:60943 dest: /10.9.141.158:50010

2013-04-15 01:05:49,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Reopen already-open Block for append blk_5280454841001477955_73262690

2013-04-15 01:06:55,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 0 for block blk_5280454841001477955_73262690 Interrupted.

2013-04-15 01:06:55,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 0 for block blk_5280454841001477955_73262690 terminating

2013-04-15 01:06:55,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_5280454841001477955_73262690 received exception java.io.EOFException: while trying to read 65557 bytes

2013-04-15 01:06:55,652 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.9.141.158:50010, storageID=DS2062116090, infoPort=50075, ipcPort=50020):DataXceiver

java.io.EOFException: while trying to read 65557 bytes

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:309)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:373)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:525)

    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:377)

    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)

    at java.lang.Thread.run(Thread.java:662)

2013-04-15 01:06:55,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Client calls recoverBlock(block=blk_5280454841001477955_73262690, targets=[10.9.141.158:50010])

2013-04-15 01:06:55,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: oldblock=blk_5280454841001477955_73262690(length=3121152), newblock=blk_5280454841001477955_73262880(length=3121152), datanode=10.9.141.158:50010

2013-04-15 01:06:55,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_5280454841001477955_73262880 src: /10.9.141.165:48547 dest: /10.9.141.158:50010

2013-04-15 01:06:55,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Reopen already-open Block for append blk_5280454841001477955_73262880

2013-04-15 01:07:58,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 0 for block blk_5280454841001477955_73262880 Interrupted.

2013-04-15 01:07:58,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 0 for block blk_5280454841001477955_73262880 terminating

2013-04-15 01:07:58,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_5280454841001477955_73262880 received exception java.io.EOFException: while trying to read 65557 bytes

2013-04-15 01:07:58,735 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.9.141.158:50010, storageID=DS2062116090, infoPort=50075, ipcPort=50020):DataXceiver

java.io.EOFException: while trying to read 65557 bytes

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:309)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:373)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:525)

    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:377)

    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)

    at java.lang.Thread.run(Thread.java:662)

10.9.141.165

2013-04-15 01:00:07,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_5280454841001477955_73253980 src: /10.9.141.165:23420 dest: /10.9.141.165:50010

2013-04-15 01:05:49,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in receiveBlock for block blk_5280454841001477955_73253980 java.io.EOFException: while trying to read 65557 bytes

2013-04-15 01:05:49,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_5280454841001477955_73253980 2 Exception java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[connected local=/10.9.141.165:39523 remote=/10.9.141.152:50010]. 290930 millis timeout left.

    at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)

    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)

    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)

    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)

    at java.io.DataInputStream.readFully(DataInputStream.java:178)

    at java.io.DataInputStream.readLong(DataInputStream.java:399)

    at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:122)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.run(BlockReceiver.java:868)

    at java.lang.Thread.run(Thread.java:662)

2013-04-15 01:05:49,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_5280454841001477955_73253980 2 : Thread is interrupted.

2013-04-15 01:05:49,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 2 for block blk_5280454841001477955_73253980 terminating

2013-04-15 01:05:49,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_5280454841001477955_73253980 received exception java.io.EOFException: while trying to read 65557 bytes

2013-04-15 01:05:49,478 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.9.141.165:50010, storageID=DS-1327849832, infoPort=50075, ipcPort=50020):DataXceiver

java.io.EOFException: while trying to read 65557 bytes

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:309)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:373)

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:525)

    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:377)

    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)

    at java.lang.Thread.run(Thread.java:662)