A problem encountered while writing to HDFS: running the command `hadoop fs -put jdk-8u141-linux-x64.tar.gz /` failed with the following errors:

```
19/07/12 21:17:51 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Got error, status message , ack with firstBadLink as 192.168.226.134:50010
	at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:140)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1359)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
19/07/12 21:17:51 INFO hdfs.DFSClient: Abandoning BP-1026157008-192.168.226.133-1562929611969:blk_1073741839_1015
19/07/12 21:17:51 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.226.134:50010,DS-1adacd4a-213c-4ca5-9e7a-1f3f5145a69e,DISK]
19/07/12 21:17:52 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Got error, status message , ack with firstBadLink as 192.168.226.135:50010
	at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:140)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1359)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
19/07/12 21:17:52 INFO hdfs.DFSClient: Abandoning BP-1026157008-192.168.226.133-1562929611969:blk_1073741840_1016
19/07/12 21:17:52 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.226.135:50010,DS-fdc495fe-a7c7-44a7-940d-e07c0680c07b,DISK]
```

The cause: the firewall on the DataNodes had not been disabled, so the client could not reach the DataNode data-transfer port (50010).
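Before changing anything, you can confirm the diagnosis by testing whether the DataNode data-transfer port is reachable from the client. A minimal sketch, using a DataNode address from the log above and assuming nc (netcat) is installed:

```
# From the client: test reachability of the DataNode data-transfer port.
# A timeout or "No route to host" here points to a firewall block.
nc -zv 192.168.226.134 50010

# On the DataNode itself (CentOS 6): inspect the active iptables rules.
service iptables status
```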

Step 1: Flush the current firewall rules (takes effect immediately, but does not survive a reboot):

```
iptables -F
```

Step 2: Permanently disable the firewall so it stays off after a reboot (SysV init, CentOS 6 and earlier):

```
chkconfig iptables off
```
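Note that chkconfig manages the SysV iptables service on CentOS 6 and earlier. If your nodes run CentOS 7 or later, where firewalld is the default firewall, the equivalent commands would be (assumption: firewalld is the active firewall):

```
systemctl stop firewalld      # stop the firewall immediately
systemctl disable firewalld   # keep it disabled across reboots
```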

Step 3: Switch SELinux to permissive mode for the current session:

```
setenforce 0
```
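You can confirm the mode change with getenforce:

```
getenforce    # should now print "Permissive"
```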

Step 4: Permanently disable SELinux:

Edit the `/etc/selinux/config` file:

```
vi /etc/selinux/config
```

and set:

```
SELINUX=disabled
```

This change takes effect after the next reboot.
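Apply all four steps on every node of the cluster; the log above shows two DataNodes (192.168.226.134 and 192.168.226.135) being excluded, so fixing only one is not enough. Then re-run the original upload to verify:

```
hadoop fs -put jdk-8u141-linux-x64.tar.gz /
hadoop fs -ls /    # the uploaded file should now appear under the HDFS root
```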

To install and deploy Hadoop on Linux, follow these steps (a sketch of typical follow-up commands appears after the reference list below):

1. Download the Hadoop package with wget. The following command downloads it into a target directory:

```
wget -P /home/cent/Downloads https://mirrors.tuna.tsinghua.edu.cn/apache/hadoop/common/hadoop-2.8.5/hadoop-2.8.5.tar.gz
```

2. If you are installing inside a virtual machine, you can also use sftp to upload the Hadoop package from Windows to Linux; adjust the transfer command to your situation.

3. Unpack the downloaded Hadoop package:

```
tar -zxvf /home/cent/Downloads/hadoop-2.8.5.tar.gz -C /opt
```

4. Upload a file to the HDFS root directory:

```
/opt/hadoop-2.8.5/bin/hadoop fs -put /kingyifan/hadoop/hadoop-2.7.7/test.txt /
```

5. Change into the Hadoop configuration directory:

```
cd /kingyifan/hadoop/hadoop-2.7.7/etc/hadoop/
```

6. Edit the Hadoop environment configuration file:

```
vim hadoop-env.sh
```

7. In hadoop-env.sh, set the JAVA_HOME variable to your JDK installation directory, for example:

```
export JAVA_HOME=/DATA/jdk/jdk1.8.0_211
```

Adjust the paths and settings to match your environment. [1][2][3]

References:

1. [Linux安装hadoop](https://blog.csdn.net/weixin_44153121/article/details/85248465)
2., 3. [Linux安装部署Hadoop及统计单词次数测试](https://blog.csdn.net/weixin_39984161/article/details/93489801)
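As a follow-up to step 7, the sketch below shows a typical first start of HDFS. It assumes the /opt/hadoop-2.8.5 layout from step 3 and that core-site.xml and hdfs-site.xml have already been configured for your cluster:

```
/opt/hadoop-2.8.5/bin/hdfs namenode -format    # one-time only: format the NameNode
/opt/hadoop-2.8.5/sbin/start-dfs.sh            # start the NameNode and DataNodes
/opt/hadoop-2.8.5/bin/hadoop fs -ls /          # confirm HDFS is up and reachable
```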