java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed

A note on an error I hit running Spark in local mode:

The error log:

[root@steve-lidiliang /user/coding/Conclusions/JavaBasic]# spark-shell 
2018-05-02 13:51:20 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 WARN  Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2018-05-02 13:51:28 ERROR SparkContext:91 - Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
  at sun.nio.ch.Net.bind0(Native Method)
  at sun.nio.ch.Net.bind(Net.java:433)
  at sun.nio.ch.Net.bind(Net.java:425)
  at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
  at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
  at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
  at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
  at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
  at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
  at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
  at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
  at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
  at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
  at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
  at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
  at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
  at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
  at java.lang.Thread.run(Thread.java:748)
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.0
      /_/
         
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_171)
Type in expressions to have them evaluated.
Type :help for more information.

scala> 
/etc/hosts

[root@steve-lidiliang /user/coding/Conclusions/JavaBasic]# cat /etc/hosts 
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
151.151.151.151 steve-lidiliang.novalocal


ifconfig

eth0: flags=4163<UP,BROADCAST,RUNNING,MULTICAST>  mtu 1500
        inet 172.16.0.228  netmask 255.255.255.0  broadcast 172.16.0.255
        inet6 fe80::f816:3eff:fe82:d364  prefixlen 64  scopeid 0x20<link>
        ether fa:16:3e:82:d3:64  txqueuelen 1000  (Ethernet)
        RX packets 728088  bytes 1864983863 (1.7 GiB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 644257  bytes 76067893 (72.5 MiB)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

lo: flags=73<UP,LOOPBACK,RUNNING>  mtu 65536
        inet 127.0.0.1  netmask 255.0.0.0
        inet6 ::1  prefixlen 128  scopeid 0x10<host>
        loop  txqueuelen 1  (Local Loopback)
        RX packets 199770  bytes 12404897 (11.8 MiB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 199770  bytes 12404897 (11.8 MiB)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0
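Comparing the two listings shows the root cause: /etc/hosts maps this host's name (steve-lidiliang.novalocal) to 151.151.151.151, but ifconfig reports no interface with that address, only 172.16.0.228 on eth0 and 127.0.0.1 on lo. The Spark driver resolves the hostname, tries to bind to 151.151.151.151, and fails with "Cannot assign requested address". A quick way to confirm the mismatch (these checks are mine, not part of the original session; output omitted):

hostname -f                      # the name the driver will resolve
getent hosts "$(hostname -f)"    # the address /etc/hosts maps that name to
ifconfig                         # the addresses actually configured on this machine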

Solution:

Change /etc/hosts so that the hostname points at the machine's real address:

[root@steve-lidiliang /user/coding/Conclusions/JavaBasic]# cat /etc/hosts 
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
172.16.0.228  steve-lidiliang.novalocal
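If editing /etc/hosts is not an option, the bind address can also be set explicitly, as the error message itself suggests. A minimal sketch (not part of the original fix), using the eth0 address from the ifconfig output above:

export SPARK_LOCAL_IP=172.16.0.228                      # address Spark binds to on this node
spark-shell --conf spark.driver.bindAddress=172.16.0.228

Either of the two is usually enough on its own; both tell the driver which local address to listen on instead of whatever the hostname happens to resolve to.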

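To make the setting permanent for this machine, it can also go into Spark's own configuration files instead of the shell (a sketch assuming a standard layout under $SPARK_HOME; adjust the path and address as needed):

echo 'export SPARK_LOCAL_IP=172.16.0.228' >> "$SPARK_HOME/conf/spark-env.sh"
# or, equivalently, as a default Spark property:
echo 'spark.driver.bindAddress 172.16.0.228' >> "$SPARK_HOME/conf/spark-defaults.conf"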