Address already in use: Service 'SparkUI' failed after 16 retries (starting from 4040)!

Address already in use: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.

spark-sql fails to start (YARN client mode)

Description: every Spark application binds its own SparkUI port, starting at 4040 by default. If the port is already taken, Spark increments the port number and retries, but only up to a default limit of 16 retries. Once all 16 retries have failed, the application gives up and aborts.
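Conceptually, the probing behaviour looks like the sketch below. This is only an illustration of the mechanism described above, not Spark's actual Utils.startServiceOnPort code; port 4040 and the limit of 16 mirror the defaults.

import java.net.{BindException, ServerSocket}

// Rough sketch of the port-probing behaviour: try basePort, then basePort + 1, ...
// and give up once the retry budget is exhausted.
def bindFirstFreePort(basePort: Int = 4040, maxRetries: Int = 16): ServerSocket = {
  for (offset <- 0 to maxRetries) {
    val port = basePort + offset
    try {
      return new ServerSocket(port)   // bound successfully, use this port
    } catch {
      case _: BindException =>        // port already in use, move on to the next one
        println(s"could not bind on port $port, attempting port ${port + 1}")
    }
  }
  throw new BindException(s"failed after $maxRetries retries (starting from $basePort)")
}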

Solution (any one of the following):

Option 1: when initializing SparkConf, add conf.set("spark.port.maxRetries", "100") (see the sketch after this list).

Option 2: when submitting the job with spark-submit, add --conf spark.port.maxRetries=100 on the command line.

Option 3: add spark.port.maxRetries 100 to spark-defaults.conf.
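For the first option, a minimal sketch of the driver-side setup (the app name is a placeholder; the properties must be set before the SparkContext is created, since the UI binds during SparkContext startup):

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Allow up to 100 port-bind retries instead of the default 16.
// Optionally, spark.ui.port moves the starting port away from the crowded 4040 range.
val conf = new SparkConf()
  .setAppName("my-spark-job")            // placeholder app name
  .set("spark.port.maxRetries", "100")
  .set("spark.ui.port", "4050")          // optional: start probing from 4050 instead of 4040

val spark = SparkSession.builder()
  .config(conf)
  .getOrCreate()

For reference, the full log of the failed startup follows.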
 

20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4045. Attempting port 4046.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4046. Attempting port 4047.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4047. Attempting port 4048.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4048. Attempting port 4049.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4049. Attempting port 4050.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4050. Attempting port 4051.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4051. Attempting port 4052.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4052. Attempting port 4053.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4053. Attempting port 4054.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4054. Attempting port 4055.
20/03/24 19:10:59 WARN util.Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.
20/03/24 19:10:59 ERROR ui.SparkUI: Failed to bind SparkUI
java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:317)
	at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
	at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:235)
	at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$newConnector$1(JettyUtils.scala:340)
	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$httpConnect$1(JettyUtils.scala:367)
	at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:370)
	at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:370)
	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2235)
	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2227)
	at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:370)
	at org.apache.spark.ui.WebUI.bind(WebUI.scala:130)
	at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:463)
	at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:463)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:463)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2490)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:918)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:910)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:910)
	at XXXXX.job.InOutChannelJob$.main(InOutChannelJob.scala:33)
	at XXXXX.job.InOutChannelJob.main(InOutChannelJob.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:775)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
20/03/24 19:10:59 INFO storage.DiskBlockManager: Shutdown hook called
20/03/24 19:10:59 INFO util.ShutdownHookManager: Shutdown hook called
20/03/24 19:10:59 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-a6f61cd7-12dd-48dd-a3b0-12388b0805e0/userFiles-40e72389-5bcd-4079-a00c-dc61dca2c9d5
20/03/24 19:10:59 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-a6f61cd7-12dd-48dd-a3b0-12388b0805e0

 
