hadoop 0.0.0.0/0.0.0.0:10020 connection refused

This article explains how to resolve a connection exception encountered when running MapReduce programs on Hadoop 2.7.1, covering the correct settings in mapred-site.xml and how to start the MapReduce JobHistoryServer service.

Problem description:

The following error appears when running a MapReduce program on Hadoop 2.7.1:

14/04/04 17:32:34 INFO mapred.ClientServiceDelegate: Application state is completed. FinalApplicationStatus=SUCCEEDED. Redirecting to job history server
14/04/04 17:32:35 INFO ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:10020. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/04/04 17:32:36 INFO ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:10020. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
...
java.io.IOException: java.net.ConnectException: Call From master/192.168.182.100 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
        at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:334)
        at org.apache.hadoop.mapred.ClientServiceDelegate.getJobStatus(ClientServiceDelegate.java:419)
        at org.apache.hadoop.mapred.YARNRunner.getJobStatus(YARNRunner.java:532)
        at org.apache.hadoop.mapreduce.Job$1.run(Job.java:314)
        at org.apache.hadoop.mapreduce.Job$1.run(Job.java:311)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        ...


 

The repeated attempts to reach port 10020 in the error message show that the job client is being redirected to the MapReduce JobHistory Server; if its address is not configured, the default value 0.0.0.0:10020 is used and the connection is refused. Fix this by setting the address in mapred-site.xml:

 

<property>
   <name>mapreduce.jobhistory.address</name>
   <!-- set this to the actual hostname and port -->
   <value>master:10020</value>
</property>
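
Optionally, the history server's web UI address can be set in the same file so job history is browsable in a browser. This is only a sketch: the hostname mirrors the example above, and 19888 is Hadoop's default port for the history web UI.

<property>
   <name>mapreduce.jobhistory.webapp.address</name>
   <!-- hostname assumed to match the example above; 19888 is the default history web UI port -->
   <value>master:19888</value>
</property>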


Then start the MapReduce JobHistory Server service.

Run the following command on the NameNode host:

sbin/mr-jobhistory-daemon.sh start historyserver
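
After running the command, a quick sanity check is to confirm that the process is up and listening on the configured port. This is just a sketch: jps ships with the JDK, and netstat may need to be replaced by an equivalent tool on your system.

jps | grep JobHistoryServer
netstat -tln | grep 10020   # should show a listener on the address configured in mapred-site.xml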


The JobHistoryServer service is now running on the NameNode host, and you can check its runtime status in the history server's log, for example:
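
By default the daemon writes its log under the Hadoop log directory; the file name below is only an assumed pattern (it includes the user and hostname), so check the logs directory for the actual name.

tail -f $HADOOP_HOME/logs/mapred-<user>-historyserver-<hostname>.log   # assumed path; adjust to your installation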


References: http://blog.csdn.net/tangzwgo/article/details/25893989

http://www.micmiu.com/bigdata/hive/hive-connect-exception-to-10020/

