java.lang.OutOfMemoryError: unable to create new native thread

A Hadoop DataNode (10.1.33.11) hit the error below while handling block writes and then shut itself down:
2014-05-21 13:53:18,504 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Reopen already-open Block for append blk_8901346392456488003_201326
2014-05-21 13:53:18,506 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.1.33.11:50010, storageID=DS-420686803-10.1.33.11-50010-1399181350037, infoPort=50075, ipcPort=50020):DataXceiver
java.lang.OutOfMemoryError: unable to create new native thread
        at java.lang.Thread.start0(Native Method)
        at java.lang.Thread.start(Thread.java:714)
        at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:576)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:404)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:112)
        at java.lang.Thread.run(Thread.java:745)
2014-05-21 13:53:25,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.1.33.11:50010, storageID=DS-420686803-10.1.33.11-50010-1399181350037, infoPort=50075, ipcPort=50020) Starting thread to transfer blk_-2263414036967179224_200863 to 10.1.33.13:50010
2014-05-21 13:53:25,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.1.33.11:50010, storageID=DS-420686803-10.1.33.11-50010-1399181350037, infoPort=50075, ipcPort=50020) Starting thread to transfer blk_-2231241119796918398_200963 to 10.1.33.15:50010
2014-05-21 13:53:25,877 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.1.33.11:50010, storageID=DS-420686803-10.1.33.11-50010-1399181350037, infoPort=50075, ipcPort=50020):Failed to transfer blk_-2263414036967179224_200863 to 10.1.33.13:50010 got java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:511)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:481)
        at org.apache.hadoop.hdfs.server.datanode.DataNode$DataTransfer.run(DataNode.java:1511)
        at java.lang.Thread.run(Thread.java:745)

2014-05-21 13:53:25,877 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: checkDiskError: exception:
java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:511)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:481)
        at org.apache.hadoop.hdfs.server.datanode.DataNode$DataTransfer.run(DataNode.java:1511)
        at java.lang.Thread.run(Thread.java:745)
2014-05-21 13:53:25,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Not checking disk as checkDiskError was called on a network related exception
2014-05-21 13:53:25,877 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode


Fix: raise the ulimit -u (max user processes) limit for the account that runs the DataNode. On Linux the nproc limit also counts threads, so once the DataNode's threads reach that ceiling the JVM can no longer create native threads and throws this OutOfMemoryError even though heap memory is still available.
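
A minimal sketch of checking and raising the limit on a Linux DataNode host. The hdfs account name and the 65536 value are assumptions here; use whichever user actually runs the DataNode and size the limit to your workload:

    # Check the current per-user process/thread limit for the DataNode user (assumed to be hdfs)
    su - hdfs -c 'ulimit -u'

    # Raise it persistently by adding lines like these to /etc/security/limits.conf
    # (on RHEL/CentOS 6, /etc/security/limits.d/90-nproc.conf may override nproc and needs the same change)
    hdfs    soft    nproc    65536
    hdfs    hard    nproc    65536

    # Restart the DataNode from a fresh login so the new limit is picked up, then verify
    # against the running process
    cat /proc/$(pgrep -f DataNode | head -1)/limits | grep 'Max processes'

After the change, the "Max processes" line in /proc/<pid>/limits for the DataNode process should show the new value.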