Hadoop NameNode won't start and the web UI can't be reached from the browser

I've been learning Hadoop recently and am writing down the pitfalls I ran into. Almost all of them came down to configuration-file problems that kept the daemons from starting.

Starting the NameNode with sbin/hadoop-daemon.sh start namenode prints:

WARNING: Use of this script to start HDFS daemons is deprecated.
WARNING: Attempting to execute replacement "hdfs --daemon start" instead.

The warning says that starting the NameNode through this script is deprecated and that it will try the replacement form hdfs --daemon start instead.
I didn't really know what that meant yet, so I just gave it a try.
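For reference, the replacement the warning points at is the daemon launcher built into the hdfs command itself in Hadoop 3.x. A minimal sketch (paths assume you are in the Hadoop home directory):

# start the NameNode as a daemon via the hdfs launcher (Hadoop 3.x)
bin/hdfs --daemon start namenode
# matching stop command
bin/hdfs --daemon stop namenode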

Running bin/hdfs namenode directly fails with:

2019-12-24 21:37:15,709 ERROR namenode.NameNode: Failed to start namenode.
java.net.SocketException: Call From hadoop101 to null:0 failed on socket exception: java.net.SocketException: Unresolved address; For more details see:  http://wiki.apache.org/hadoop/SocketException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:833)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:800)
        at org.apache.hadoop.ipc.Server.bind(Server.java:620)
        at org.apache.hadoop.ipc.Server$Listener.<init>(Server.java:1184)
        at org.apache.hadoop.ipc.Server.<init>(Server.java:3066)
        at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:1039)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server.<init>(ProtobufRpcEngine.java:426)
        at org.apache.hadoop.ipc.ProtobufRpcEngine.getServer(ProtobufRpcEngine.java:347)
        at org.apache.hadoop.ipc.RPC$Builder.build(RPC.java:848)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.<init>(NameNodeRpcServer.java:460)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createRpcServer(NameNode.java:799)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:713)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:953)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:926)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1692)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1759)
Caused by: java.net.SocketException: Unresolved address
        at sun.nio.ch.Net.translateToSocketException(Net.java:131)
        at sun.nio.ch.Net.translateException(Net.java:157)
        at sun.nio.ch.Net.translateException(Net.java:163)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:76)
        at org.apache.hadoop.ipc.Server.bind(Server.java:603)
        ... 13 more
Caused by: java.nio.channels.UnresolvedAddressException
        at sun.nio.ch.Net.checkAddress(Net.java:101)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:218)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        ... 14 more
2019-12-24 21:37:15,713 INFO util.ExitUtil: Exiting with status 1: java.net.SocketException: Call From hadoop101 to null:0 failed on socket exception: java.net.SocketException: Unresolved address; For more details see:  http://wiki.apache.org/hadoop/SocketException
2019-12-24 21:37:15,715 INFO namenode.NameNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at localhost/127.0.0.1
************************************************************/

jps shows the NameNode did not start. The error essentially means the address hadoop101 cannot be resolved.

Open etc/hadoop/hdfs-site.xml (vim etc/hadoop/hdfs-site.xml) and add the web UI address:

<property>
		<name>dfs.namenode.http-address</name>
		<value>hadoop101:50070</value>
</property>
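As a side note, a quick way to see which addresses the NameNode will actually use is hdfs getconf, which is part of the standard HDFS CLI (a sketch, run from the Hadoop home directory):

# print the effective web UI address and the default filesystem (RPC) address
bin/hdfs getconf -confKey dfs.namenode.http-address
bin/hdfs getconf -confKey fs.defaultFS

If either of these prints a hostname the machine can't resolve, the bind fails exactly as in the stack trace above.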

Starting it again still fails:

java.net.SocketException: Unresolved address
        at sun.nio.ch.Net.translateToSocketException(Net.java:131)
        at sun.nio.ch.Net.translateException(Net.java:157)
        at sun.nio.ch.Net.translateException(Net.java:163)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:76)
        at org.eclipse.jetty.server.ServerConnector.openAcceptChannel(ServerConnector.java:351)
        at org.eclipse.jetty.server.ServerConnector.open(ServerConnector.java:319)
        at org.apache.hadoop.http.HttpServer2.bindListener(HttpServer2.java:1205)
        at org.apache.hadoop.http.HttpServer2.bindForSinglePort(HttpServer2.java:1236)
        at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:1299)
        at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:1154)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:181)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:885)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:707)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:953)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:926)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1692)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1759)
Caused by: java.nio.channels.UnresolvedAddressException
        at sun.nio.ch.Net.checkAddress(Net.java:101)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:218)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        ... 13 more
2019-12-24 21:58:48,999 INFO impl.MetricsSystemImpl: Stopping NameNode metrics system...
2019-12-24 21:58:49,000 INFO impl.MetricsSystemImpl: NameNode metrics system stopped.
2019-12-24 21:58:49,000 INFO impl.MetricsSystemImpl: NameNode metrics system shutdown complete.
2019-12-24 21:58:49,000 ERROR namenode.NameNode: Failed to start namenode.
java.net.SocketException: Unresolved address
        at sun.nio.ch.Net.translateToSocketException(Net.java:131)
        at sun.nio.ch.Net.translateException(Net.java:157)
        at sun.nio.ch.Net.translateException(Net.java:163)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:76)
        at org.eclipse.jetty.server.ServerConnector.openAcceptChannel(ServerConnector.java:351)
        at org.eclipse.jetty.server.ServerConnector.open(ServerConnector.java:319)
        at org.apache.hadoop.http.HttpServer2.bindListener(HttpServer2.java:1205)
        at org.apache.hadoop.http.HttpServer2.bindForSinglePort(HttpServer2.java:1236)
        at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:1299)
        at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:1154)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:181)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:885)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:707)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:953)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:926)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1692)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1759)
Caused by: java.nio.channels.UnresolvedAddressException
        at sun.nio.ch.Net.checkAddress(Net.java:101)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:218)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        ... 13 more
2019-12-24 21:58:49,001 INFO util.ExitUtil: Exiting with status 1: java.net.SocketException: Unresolved address
2019-12-24 21:58:49,005 INFO namenode.NameNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at localhost/127.0.0.1
************************************************************/

The error means the address is unresolved, i.e. the system does not know which IP address hadoop101 maps to, so add an entry on the server:
Edit /etc/hosts (vim /etc/hosts) and map hadoop101 to 127.0.0.1, as in the sketch below.
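A minimal single-node version of the hosts file would look roughly like this (an illustrative sketch; on a real multi-node cluster hadoop101 should map to the machine's LAN IP instead of the loopback address):

127.0.0.1   localhost
127.0.0.1   hadoop101

Afterwards, getent hosts hadoop101 (or simply ping hadoop101) can confirm that the name now resolves.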
Restart the network with /etc/init.d/network restart and run bin/hdfs namenode again: this time it runs without errors, but jps still shows no NameNode process (not sure why this command doesn't leave it up; most likely it runs the NameNode in the foreground rather than as a daemon).
Running sbin/hadoop-daemon.sh start namenode once more, the NameNode finally starts successfully.

Opening hadoop101:50070 in the browser still fails.
Go back into etc/hadoop/hdfs-site.xml (vim etc/hadoop/hdfs-site.xml) and change the web address to:

<property>
		<name>dfs.namenode.http-address</name>
		<value>hadoop101:50070</value>  <!-- if this doesn't work, try 0.0.0.0:50070 to bind on all interfaces; the hostname form can fail when it isn't configured correctly elsewhere -->
</property>
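If the page still doesn't open after restarting, it can help to confirm that the HTTP server is really listening on the expected port (an illustrative check with standard Linux tools, not something from the Hadoop docs):

# confirm a listener on the web UI port
ss -lntp | grep 50070
# or, on older distributions
netstat -tlnp | grep 50070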

Restart and visit the page again: success.

Similarly, to reach the YARN web UI at hadoop101:8088, add the following to yarn-site.xml:

<property>
		<name>yarn.resourcemanager.webapp.address</name>
		<value>hadoop101:8088</value>
</property>
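For the new address to take effect, YARN has to be restarted. Depending on the version this is roughly (a sketch using the bundled start/stop scripts):

# restart YARN so yarn-site.xml is re-read
sbin/stop-yarn.sh
sbin/start-yarn.sh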

Finally, visiting http://hadoop101:50070 in the browser brings up the Hadoop NameNode web UI.
