Hadoop cluster startup issue: org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8031
Problem description
After starting the Hadoop cluster, the NodeManager log on a worker node shows endless attempts to reconnect to the ResourceManager (port 8031 is YARN's resource-tracker port, which NodeManagers register through).
1. Error message
2024-07-22 11:08:59,436 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8031. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2024-07-22 11:09:00,437 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8031. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2024-07-22 11:09:01,439 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8031. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2024-07-22 11:09:02,441 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8031. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2024-07-22 11:09:03,442 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8031. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2024-07-22 11:09:34,447 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8031. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2024-07-22 11:09:35,448 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8031. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2024-07-22 11:09:36,450 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8031. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2024-07-22 11:09:37,452 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8031. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2024-07-22 11:09:38,454 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8031. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2024-07-22 11:09:39,456 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8031. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2024-07-22 11:09:40,458 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8031. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
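Before changing any configuration, it is worth confirming what the worker is actually failing to reach. A minimal diagnostic sketch, assuming hadoop103 is the master hostname (substitute your own):
# On the master: check whether the ResourceManager is listening on 8031
ss -tlnp | grep 8031
# On the failing worker: test whether the master's resource-tracker port is reachable
nc -zv hadoop103 8031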
2. Solutions
1). Disable the firewall (a narrower per-port alternative is sketched after the commands below)
(The systemctl/firewalld commands apply to CentOS 7+; the service/chkconfig/iptables commands to CentOS 6.)
1: Check the firewall status
systemctl status firewalld
service iptables status
2: Temporarily stop the firewall
systemctl stop firewalld
service iptables stop
3: Permanently disable the firewall
systemctl disable firewalld
chkconfig iptables off
4: Re-enable and restart the firewall
systemctl enable firewalld
service iptables restart
5: Re-enable on boot after a permanent disable
# not tested yet
chkconfig iptables on
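Disabling the firewall outright is only reasonable on a trusted internal network. A narrower option, sketched below assuming firewalld on the master node, is to open just the ResourceManager ports (8030 scheduler, 8031 resource tracker, 8032 client, 8033 admin):
# Open the YARN ResourceManager port range permanently, then reload the rules
firewall-cmd --permanent --add-port=8030-8033/tcp
firewall-cmd --reload
# Confirm the ports are now open
firewall-cmd --list-ports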
2). Modify yarn-site.xml
The worker retries 0.0.0.0:8031 because yarn.resourcemanager.hostname is not set, so yarn.resourcemanager.resource-tracker.address falls back to its default of 0.0.0.0:8031. Point the ResourceManager addresses explicitly at the master node:
<property>
    <name>yarn.resourcemanager.address</name>
    <value>hadoop103:8032</value>
</property>
<property>
    <name>yarn.resourcemanager.scheduler.address</name>
    <value>hadoop103:8030</value>
</property>
<property>
    <name>yarn.resourcemanager.resource-tracker.address</name>
    <value>hadoop103:8031</value>
</property>
Note: replace hadoop103 with the master node's IP address or hostname. (Alternatively, setting the single property yarn.resourcemanager.hostname to the master host gives all three addresses above their correct defaults.)
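Every node must also be able to resolve that hostname. A quick check, with hadoop103 as the placeholder master name and 192.168.1.103 as a hypothetical IP:
# Should print the master's IP; if it prints nothing, add an /etc/hosts entry
getent hosts hadoop103
# Example /etc/hosts line (IP is a placeholder):
# 192.168.1.103  hadoop103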
Then distribute the updated file to the worker nodes with scp:
# distribution example; hadoop104 stands in for a worker hostname
scp -r /hadoop-3.2.2/etc/hadoop/yarn-site.xml root@hadoop104:/hadoop-3.2.2/etc/hadoop/
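With several workers, a short loop avoids repeating the command; hadoop104 and hadoop105 below are hypothetical worker hostnames:
# Push yarn-site.xml to every worker in one pass
for host in hadoop104 hadoop105; do
    scp /hadoop-3.2.2/etc/hadoop/yarn-site.xml root@${host}:/hadoop-3.2.2/etc/hadoop/
done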
3). Restart Hadoop on the master node
# example
# stop HDFS
[root@node1 hadoop-3.2.2]# ./sbin/stop-dfs.sh
# stop YARN
[root@node1 hadoop-3.2.2]# ./sbin/stop-yarn.sh
# or stop the whole cluster at once
[root@node1 hadoop-3.2.2]# ./sbin/stop-all.sh
# start HDFS
[root@node1 hadoop-3.2.2]# ./sbin/start-dfs.sh
# start YARN
[root@node1 hadoop-3.2.2]# ./sbin/start-yarn.sh
# or start the whole cluster at once
[root@node1 hadoop-3.2.2]# ./sbin/start-all.sh
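After the restart, verify that the worker has registered. These are standard Hadoop CLI checks; run them on the master:
# jps should show ResourceManager/NameNode on the master, NodeManager/DataNode on workers
jps
# The previously failing worker should now appear with state RUNNING
yarn node -list -all
# And its DataNode should be counted among the live nodes
hdfs dfsadmin -report | grep -i 'live datanodes'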