This is the error HBase reported; after digging in, it turned out that Hadoop (HDFS) could no longer accept writes. Worse, HBase silently loses the data that was being put during that window.
DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink
DFSClient: Abandoning block blk_
DFSClient: Waiting to find target node
DFSClient: DataStreamer Exception: java.io.IOException: Unable to create new block
WARN hdfs.DFSClient: Error Recovery for block blk_-4764663209577220400_66475 bad datanode[1] nodes == null
WARN hdfs.DFSClient: Could not get block locations. Source file "..." - Aborting......
put: Bad connect ack with firstBadLink
ERROR hdfs.DFSClient: Exception closing file .... : java.io.IOException: Bad connect ack with firstBadLink ...
java.io.IOException: Bad connect ack with firstBadLink...
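These DFSClient errors surface on the HBase client side as failed puts, and the rows in that batch are gone unless the application keeps them. Below is a minimal sketch of holding on to failed puts for a later retry; it assumes the classic HTable/Put client API of HBase 0.90-era releases, and the class name SafePutter is made up for illustration:

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;

public class SafePutter {
    // Puts that could not be flushed; the caller can retry them once HDFS recovers.
    private final List<Put> pending = new ArrayList<Put>();
    private final HTable table;

    public SafePutter(Configuration conf, String tableName) throws IOException {
        this.table = new HTable(conf, tableName);
    }

    /** Try to write a batch; on failure keep the puts instead of dropping them. */
    public void putOrKeep(List<Put> puts) {
        try {
            table.put(puts);
            table.flushCommits();   // flush explicitly so write failures surface here
        } catch (IOException e) {
            // HDFS is refusing writes (e.g. "Bad connect ack with firstBadLink"):
            // park the batch client-side so the data is not silently lost.
            pending.addAll(puts);
        }
    }

    public List<Put> getPending() {
        return pending;
    }
}
```

The point is simply not to treat put() as fire-and-forget: flush explicitly, and when HDFS is refusing writes, keep the batch around instead of letting it disappear.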
Causes:
1. A firewall was unexpectedly enabled on one of the nodes, so the client could not connect to that DataNode (see the connectivity sketch after this list).
2. A DataNode process was forcibly killed (reportedly).
3. A machine simply went down. If that alone is enough to break the write, what is the replication / fault-tolerance design even for? (speculation)
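For the first cause, a quick way to tell whether a firewall is blocking a DataNode is to try opening a TCP connection to its data-transfer port from the client machine. A rough sketch, assuming the default Hadoop 0.20/1.x data-transfer port 50010 (dfs.datanode.address); adjust the port if your cluster overrides it:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class DataNodePortCheck {
    // Default DataNode data-transfer port in Hadoop 0.20/1.x; change if overridden.
    private static final int DATA_PORT = 50010;

    public static void main(String[] args) {
        for (String host : args) {
            Socket s = new Socket();
            try {
                s.connect(new InetSocketAddress(host, DATA_PORT), 3000);
                System.out.println(host + ": port " + DATA_PORT + " reachable");
            } catch (IOException e) {
                // A firewall dropping packets usually shows up here as a timeout,
                // matching the "Bad connect ack with firstBadLink" symptom.
                System.out.println(host + ": NOT reachable - " + e.getMessage());
            } finally {
                try { s.close(); } catch (IOException ignored) { }
            }
        }
    }
}
```

Run it with the DataNode hostnames as arguments, e.g. `java DataNodePortCheck dn1 dn2 dn3`; a timeout on exactly one host usually points at the node whose firewall came up.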