Exception: port bind fails after repeated retries

11-08-2020 11:38:48 CST t_uac_user INFO - Starting job t_uac_user at 1597117128685
11-08-2020 11:38:48 CST t_uac_user INFO - job JVM args: '-Dazkaban.flowid=order' '-Dazkaban.execid=130' '-Dazkaban.jobid=t_uac_user'
11-08-2020 11:38:48 CST t_uac_user INFO - user.to.proxy property was not set, defaulting to submit user azkaban
11-08-2020 11:38:48 CST t_uac_user INFO - Building command job executor. 
11-08-2020 11:38:48 CST t_uac_user INFO - Memory granted for job t_uac_user
11-08-2020 11:38:48 CST t_uac_user INFO - 1 commands to execute.
11-08-2020 11:38:48 CST t_uac_user INFO - cwd=/opt/azkaban/azkaban-3.81.0/azkaban-exec-server/build/install/azkaban-exec-server/executions/130
11-08-2020 11:38:48 CST t_uac_user INFO - effective user is: azkaban
11-08-2020 11:38:48 CST t_uac_user INFO - Command: ssh mz-hadoop-01 "source /etc/profile;/mnt/db_file/wfs/paascloud/t_uac_user.sh  "
11-08-2020 11:38:48 CST t_uac_user INFO - Environment variables: {JOB_OUTPUT_PROP_FILE=/opt/azkaban/azkaban-3.81.0/azkaban-exec-server/build/install/azkaban-exec-server/executions/130/t_uac_user_output_8692020296885557590_tmp, JOB_PROP_FILE=/opt/azkaban/azkaban-3.81.0/azkaban-exec-server/build/install/azkaban-exec-server/executions/130/t_uac_user_props_3322224465573219858_tmp, KRB5CCNAME=/tmp/krb5cc__order__order__t_uac_user__130__azkaban, JOB_NAME=t_uac_user}
11-08-2020 11:38:48 CST t_uac_user INFO - Working directory: /opt/azkaban/azkaban-3.81.0/azkaban-exec-server/build/install/azkaban-exec-server/executions/130
11-08-2020 11:38:48 CST t_uac_user INFO - Spawned process with id 22057
11-08-2020 11:38:48 CST t_uac_user INFO - Start date: 2020-08-10, end date: 2020-08-10
11-08-2020 11:38:51 CST t_uac_user INFO - 20/08/11 11:38:51 INFO spark.SparkContext: Running Spark version 2.4.0.cloudera2
11-08-2020 11:38:51 CST t_uac_user INFO - 20/08/11 11:38:51 INFO spark.SparkContext: Submitted application: t_uac_user
11-08-2020 11:38:51 CST t_uac_user INFO - 20/08/11 11:38:51 INFO spark.SecurityManager: Changing view acls to: root
11-08-2020 11:38:51 CST t_uac_user INFO - 20/08/11 11:38:51 INFO spark.SecurityManager: Changing modify acls to: root
11-08-2020 11:38:51 CST t_uac_user INFO - 20/08/11 11:38:51 INFO spark.SecurityManager: Changing view acls groups to: 
11-08-2020 11:38:51 CST t_uac_user INFO - 20/08/11 11:38:51 INFO spark.SecurityManager: Changing modify acls groups to: 
11-08-2020 11:38:51 CST t_uac_user INFO - 20/08/11 11:38:51 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO util.Utils: max retries is 16
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO util.Utils: Successfully started service 'sparkDriver' on port 37149.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO spark.SparkEnv: Registering MapOutputTracker
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO spark.SparkEnv: Registering BlockManagerMaster
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-031bffb6-fbb3-4aaa-bfb3-f6213e01d7f4
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO spark.SparkEnv: Registering OutputCommitCoordinator
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO util.log: Logging initialized @3278ms
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO server.Server: Started @3449ms
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO util.Utils: max retries is 16
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4045. Attempting port 4046.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4046. Attempting port 4047.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4047. Attempting port 4048.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4048. Attempting port 4049.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4049. Attempting port 4050.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4050. Attempting port 4051.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4051. Attempting port 4052.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4052. Attempting port 4053.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4053. Attempting port 4054.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4054. Attempting port 4055.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 WARN util.Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 ERROR ui.SparkUI: Failed to bind SparkUI
11-08-2020 11:38:52 CST t_uac_user INFO - java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
11-08-2020 11:38:52 CST t_uac_user INFO - 	at sun.nio.ch.Net.bind0(Native Method)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at sun.nio.ch.Net.bind(Net.java:433)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at sun.nio.ch.Net.bind(Net.java:425)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.spark_project.jetty.server.ServerConnector.openAcceptChannel(ServerConnector.java:351)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:319)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:235)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$newConnector$1(JettyUtils.scala:353)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$httpConnect$1(JettyUtils.scala:382)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:385)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:385)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2315)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2307)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:385)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.ui.WebUI.bind(WebUI.scala:132)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:452)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:452)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at scala.Option.foreach(Option.scala:257)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:452)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at scala.Option.getOrElse(Option.scala:121)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at com.mingzhi.universal.load_all_from_mysql$.load(load_all_from_mysql.scala:73)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at com.mingzhi.universal.load_all_from_mysql$.main(load_all_from_mysql.scala:24)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at com.mingzhi.universal.load_all_from_mysql.main(load_all_from_mysql.scala)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at java.lang.reflect.Method.invoke(Method.java:498)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
11-08-2020 11:38:52 CST t_uac_user INFO - 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO storage.DiskBlockManager: Shutdown hook called
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO util.ShutdownHookManager: Shutdown hook called
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-7596b4c8-6950-47e0-a4c3-29042fe6917d/userFiles-5e9d9769-a283-41f1-882a-e53364bea979
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-7596b4c8-6950-47e0-a4c3-29042fe6917d
11-08-2020 11:38:52 CST t_uac_user INFO - 20/08/11 11:38:52 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-a0f37fef-d344-420e-91e2-7fa5f37490ee
11-08-2020 11:38:53 CST t_uac_user INFO - Process with id 22057 completed unsuccessfully in 4 seconds.
11-08-2020 11:38:53 CST t_uac_user ERROR - Job run failed!
java.lang.RuntimeException: azkaban.jobExecutor.utils.process.ProcessFailureException: Process exited with code 1
	at azkaban.jobExecutor.ProcessJob.run(ProcessJob.java:305)
	at azkaban.execapp.JobRunner.runJob(JobRunner.java:813)
	at azkaban.execapp.JobRunner.doRun(JobRunner.java:602)
	at azkaban.execapp.JobRunner.run(JobRunner.java:563)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: azkaban.jobExecutor.utils.process.ProcessFailureException: Process exited with code 1
	at azkaban.jobExecutor.utils.process.AzkabanProcess.run(AzkabanProcess.java:125)
	at azkaban.jobExecutor.ProcessJob.run(ProcessJob.java:297)
	... 8 more
11-08-2020 11:38:53 CST t_uac_user ERROR - azkaban.jobExecutor.utils.process.ProcessFailureException: Process exited with code 1 cause: azkaban.jobExecutor.utils.process.ProcessFailureException: Process exited with code 1
11-08-2020 11:38:53 CST t_uac_user INFO - Finishing job t_uac_user at 1597117133335 with status FAILED

Solution:

Address already in use: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.

spark-sql fails at startup (yarn client mode)

Background: every Spark application occupies one SparkUI port, 4040 by default. If that port is taken, Spark increments the port number and retries, up to a default limit of 16 attempts. Once all 16 attempts fail, the application gives up and aborts, which is exactly what the log above shows (ports 4040 through 4055 were all in use).
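The retry behavior described above can be sketched as a plain socket loop. This is an illustrative Python sketch of what Spark's `Utils.startServiceOnPort` does, not Spark's actual code; the function name mirrors Spark's for readability only:

```python
import socket

def start_service_on_port(start_port: int, max_retries: int = 16) -> socket.socket:
    """Try to bind start_port, then start_port+1, ..., giving up after
    max_retries attempts -- a sketch of the retry loop behind the
    "failed after 16 retries" error, not Spark's real implementation."""
    last_err = None
    for offset in range(max_retries):
        port = start_port + offset
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            sock.bind(("127.0.0.1", port))
            return sock  # bound successfully; caller owns the socket
        except OSError as err:  # port in use -> close and try the next one
            sock.close()
            last_err = err
    raise OSError(
        f"Service failed after {max_retries} retries "
        f"(starting from {start_port}): {last_err}"
    )
```

With the default `max_retries=16` and `start_port=4040`, exhausting ports 4040–4055 produces the same "failed after 16 retries" outcome as the log.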

Fix (any one of the following):

When initializing SparkConf, add conf.set("spark.port.maxRetries", "100")

When submitting with spark-submit, add on the command line:
--conf spark.port.maxRetries=128

Add spark.port.maxRetries 100 to spark-defaults.conf
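The error message itself also suggests an alternative to raising spark.port.maxRetries: explicitly setting spark.ui.port to a port known to be free. A minimal helper for picking such a port (the function name is my own, not something Spark ships):

```python
import socket

def find_free_ui_port() -> int:
    """Ask the OS for any free TCP port; the result can then be passed
    to Spark as --conf spark.ui.port=<port>. Illustrative helper only."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.bind(("127.0.0.1", 0))  # port 0 = let the kernel choose
        return sock.getsockname()[1]
```

Note the usual caveat with this pattern: the port is released again when the helper returns, so another process could grab it before Spark binds it. Raising spark.port.maxRetries is the more robust fix on a busy host.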