java.net.BindException: Cannot assign requested address

1. java.net.BindException: Cannot assign requested address

Error when starting in standalone (single-machine) mode:

2019-02-21 09:46:24 WARN Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2019-02-21 09:46:24 WARN Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2019-02-21 09:46:24 WARN Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2019-02-21 09:46:24 WARN Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
2019-02-21 09:46:24 ERROR SparkContext:91 - Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
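Before applying the fix, it helps to know the cause: by default the driver binds to the address that the machine's hostname resolves to, and this exception means that address does not exist on any local network interface (often because of a stale or wrong entry in /etc/hosts). A quick check, assuming a standard Linux machine:

hostname                   # the name Spark will try to resolve
getent hosts $(hostname)   # the address that name resolves to
ip addr | grep 'inet '     # addresses actually present on this machine
# If the resolved address is not among the ones listed, the bind fails.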
The fix is to add the following to the configuration file (/etc/profile here):
export SPARK_MASTER_IP=localhost
export SPARK_LOCAL_IP=localhost
Run source /etc/profile to apply the environment settings.
Start it again and it works.
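The error message itself points at spark.driver.bindAddress as an alternative. Instead of putting the exports in /etc/profile, the same setting can live in Spark's own env file, or be passed per run; a minimal sketch, assuming a default Spark layout with SPARK_HOME set:

# Put the setting in Spark's env file instead of /etc/profile:
echo 'export SPARK_LOCAL_IP=localhost' >> $SPARK_HOME/conf/spark-env.sh

# Or set the bind address just for one run
# (run-example forwards spark-submit options):
run-example --conf spark.driver.bindAddress=127.0.0.1 SparkPi 2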
I used SparkPi to compute the value of Pi. Run the command: run-example SparkPi 2 (the argument 2 sets the parallelism to two).
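For reference, run-example is a thin wrapper around spark-submit with the bundled examples jar; a roughly equivalent direct call looks like this (the jar name varies by Spark and Scala version, hence the glob):

spark-submit --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_*.jar 2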

2. Passwordless SSH login missing when starting the Hadoop cluster

Fix: set up passwordless login again.
(1) Generate a public/private key pair with ssh-keygen.
Run the command ssh-keygen -t rsa and press Enter three times; this generates two files, the private key id_rsa and the public key id_rsa.pub. ssh-keygen creates and manages RSA-type keys; the -t option specifies RSA as the type of SSH key to create.
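The three Enter presses just accept the default key path and an empty passphrase, so the same thing can be done non-interactively:

# -f sets the key location, -N '' sets an empty passphrase (no prompts)
ssh-keygen -t rsa -f ~/.ssh/id_rsa -N ''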
(2) Copy the public key to the remote machines with ssh-copy-id. Run:
ssh-copy-id -i /root/.ssh/id_rsa.pub localhost   # enter yes, then p@ssw0rd (the root password) when prompted
ssh-copy-id -i /root/.ssh/id_rsa.pub slave1      # same as above
ssh-copy-id -i /root/.ssh/id_rsa.pub slave2      # same as above
(3) Verify that SSH can log in without a password.
From localhost, run ssh slave1 and ssh slave2; if both log in directly, passwordless SSH is configured successfully.
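The same check can be scripted; BatchMode=yes makes ssh fail immediately instead of falling back to a password prompt, so any host still asking for a password shows up as an error:

for host in localhost slave1 slave2; do
  ssh -o BatchMode=yes "$host" 'echo OK from $(hostname)'
done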
After that, starting the cluster no longer prompts for passwords.

3. Errors when starting the cluster

starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark-3.3.2-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
failed to launch: nice -n 0 /usr/local/spark-3.3.2-bin-hadoop3/bin/spark-class org.apache.spark.deploy.master.Master --host master --port 7077 --webui-port 8080
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:260)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:469)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:503)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
23/05/15 11:25:06 INFO ShutdownHookManager: Shutdown hook called
full log in /usr/local/spark-3.3.2-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark-3.3.2-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-master.out
localhost: failed to launch: nice -n 0 /usr/local/spark-3.3.2-bin-hadoop3/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://master:7077
localhost: at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:260)
localhost: at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
localhost: at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
localhost: at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:469)
localhost: at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:503)
localhost: at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
localhost: at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
localhost: at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
localhost: at java.lang.Thread.run(Thread.java:750)
localhost: 23/05/15 11:25:09 INFO ShutdownHookManager: Shutdown hook called
localhost: full log in /usr/local/spark-3.3.2-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-master.out
[root@master ~]# ll /conf
ls: cannot access /conf: No such file or directory
[root@master ~]# ll conf
ls: cannot access conf: No such file or directory
[root@master ~]# cd /conf
-bash: cd: /conf: No such file or directory
[root@master ~]# ll /usr/local/spark-3.3.2-bin-hadoop3/conf
total 36
-rw-r--r--. 1 501 1000 1105 Feb 11 04:40 fairscheduler.xml.template
-rw-r--r--. 1 501 1000 3350 Feb 11 04:40 log4j2.properties.template
-rw-r--r--. 1 501 1000 9141 Feb 11 04:40 metrics.properties.template
-rw-r--r--. 1 501 1000 1292 Feb 11 04:40 spark-defaults.conf.template
-rwxr-xr-x. 1 501 1000 4506 Feb 11 04:40 spark-env.sh.template
-rw-r--r--. 1 501 1000  865 Feb 11 04:40 workers.template
[root@master ~]# vim spark-env.sh.template

[1]+  Stopped                 vim spark-env.sh.template
[root@master ~]# vim spark-env.sh

[2]+  Stopped                 vim spark-env.sh
[root@master ~]# cd /usr/local/spark-3.3.2-bin-hadoop3/conf
[root@master conf]# vim spark-env.sh.template
[root@master conf]# start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark-3.3.2-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
failed to launch: nice -n 0 /usr/local/spark-3.3.2-bin-hadoop3/bin/spark-class org.apache.spark.deploy.master.Master --host master --port 7077 --webui-port 8080
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:260)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:469)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:503)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
23/05/15 11:35:40 INFO ShutdownHookManager: Shutdown hook called
full log in /usr/local/spark-3.3.2-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark-3.3.2-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-master.out
[root@master ~]# start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark-3.3.2-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
failed to launch: nice -n 0 /usr/local/spark-3.3.2-bin-hadoop3/bin/spark-class org.apache.spark.deploy.master.Master --host master --port 7077 --webui-port 8080
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:260)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:469)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:503)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
23/05/15 11:37:50 INFO ShutdownHookManager: Shutdown hook called
full log in /usr/local/spark-3.3.2-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark-3.3.2-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-master.out
localhost: failed to launch: nice -n 0 /usr/local/spark-3.3.2-bin-hadoop3/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://master:7077
localhost: at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:260)
localhost: at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
localhost: at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
localhost: at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:469)
localhost: at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:503)
localhost: at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
localhost: at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
localhost: at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
localhost: at java.lang.Thread.run(Thread.java:750)
localhost: 23/05/15 11:37:53 INFO ShutdownHookManager: Shutdown hook called
localhost: full log in /usr/local/spark-3.3.2-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-master.out
I haven't found the exact cause yet. I tried changing the configuration file and the hosts file, but still couldn't work out how to fix it. If anyone knows a solution, please message me; I'd be very grateful.
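Two things in the transcript above may be worth checking. First, the edits went into spark-env.sh.template, but Spark only reads conf/spark-env.sh, so the template has to be copied first. Second, the Master is launched with --host master, so the name master must resolve to an address that exists on this machine, which is the same bind problem as in section 1. A sketch (hostname taken from the transcript; adjust to your cluster):

# Spark ignores *.template files; create the real config first
cd /usr/local/spark-3.3.2-bin-hadoop3/conf
cp spark-env.sh.template spark-env.sh

# Check that 'master' resolves to an address this machine actually has
getent hosts master      # what 'master' resolves to
ip addr | grep 'inet '   # addresses present on this machine
# If they disagree, fix the mapping in /etc/hosts, e.g. (placeholder IP):
#   192.168.1.10  master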
