Notes on debugging Spark from IntelliJ IDEA on Windows against a remote Spark cluster

Linux Spark cluster:
spark-master
spark-worker-1

Running Spark from a Windows development environment, connected to the cluster. In debug mode, with the right settings in the SparkConf, the connection and execution succeed. It can look as though cross-environment debugging (Windows driver, Linux cluster) is failing, but it actually works: open the master's web UI and check the running processes and logs to confirm. The resulting executor launch command is shown below:
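The key is pointing the SparkConf at the remote master and telling the executors how to reach back to the Windows driver. A minimal sketch, assuming a standalone master on the default port 7077; the driver host/port values are taken from the launch command below, while the app name, object name, and jar path are illustrative placeholders:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object TestSpark {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("testSpark")
      // Remote standalone master (7077 is the default master port).
      .setMaster("spark://spark-master:7077")
      // The Windows machine's address, reachable from the workers,
      // so executors can connect back to the driver.
      .set("spark.driver.host", "172.16.0.162")
      // Pin the driver port instead of letting Spark pick a random one
      // (easier to open in a firewall).
      .set("spark.driver.port", "53110")
      // Ship the locally built job jar so executors have its classes
      // (path is a placeholder for your build output).
      .setJars(Seq("target/testSpark.jar"))

    val spark = SparkSession.builder().config(conf).getOrCreate()
    // Quick smoke test: run a distributed count on the remote cluster.
    println(spark.sparkContext.parallelize(1 to 100).count())
    spark.stop()
  }
}
```

With this in place you can set breakpoints in IDEA and run the main class in debug mode; the driver runs locally while executors run on the cluster.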

"/usr/lib/jvm/java-8-openjdk-amd64//bin/java" \
"-cp" "/mnt/geoSpark/*:/spark/conf/:/spark/jars/*:/etc/hadoop/:/opt/hadoop-3.2.1/share/hadoop/common/lib/*:/opt/hadoop-3.2.1/share/hadoop/common/*:/opt/hadoop-3.2.1/share/hadoop/hdfs/:/opt/hadoop-3.2.1/share/hadoop/hdfs/lib/*:/opt/hadoop-3.2.1/share/hadoop/hdfs/*:/opt/hadoop-3.2.1/share/hadoop/mapreduce/lib/*:/opt/hadoop-3.2.1/share/hadoop/mapreduce/*:/opt/hadoop-3.2.1/share/hadoop/yarn/:/opt/hadoop-3.2.1/share/hadoop/yarn/lib/*:/opt/hadoop-3.2.1/share/hadoop/yarn/*" \
"-Xmx1024M" "-Dspark.driver.port=53110" \ "org.apache.spark.executor.CoarseGrainedExecutorBackend" \
"--driver-url" "spark://CoarseGrainedScheduler@172.16.0.162:53110" \
"--executor-id" "0" \
"--hostname" "192.168.240.12" \
"--cores" "96" \
"--app-id" "app-20220120140458-0039" \
"--worker-url" "spark://Worker@192.168.240.12:42339"

With Spark in IDEA on Windows acting as the driver, the worker-side execution log:

2022-01-20 14:04:58,428 INFO worker.ExecutorRunner: Launch command: "/usr/lib/jvm/java-8-openjdk-amd64//bin/java" "-cp" "/mnt/geoSpark/*:/spark/conf/:/spark/jars/*:/etc/hadoop/:/opt/hadoop-3.2.1/share/hadoop/common/lib/*:/opt/hadoop-3.2.1/share/hadoop/common/*:/opt/hadoop-3.2.1/share/hadoop/hdfs/:/opt/hadoop-3.2.1/share/hadoop/hdfs/lib/*:/opt/hadoop-3.2.1/share/hadoop/hdfs/*:/opt/hadoop-3.2.1/share/hadoop/mapreduce/lib/*:/opt/hadoop-3.2.1/share/hadoop/mapreduce/*:/opt/hadoop-3.2.1/share/hadoop/yarn/:/opt/hadoop-3.2.1/share/hadoop/yarn/lib/*:/opt/hadoop-3.2.1/share/hadoop/yarn/*" "-Xmx1024M" "-Dspark.driver.port=53110" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@172.16.0.162:53110" "--executor-id" "0" "--hostname" "192.168.240.12" "--cores" "96" "--app-id" "app-20220120140458-0039" "--worker-url" "spark://Worker@192.168.240.12:42339"
2022-01-20 14:05:32,318 INFO worker.Worker: Asked to kill executor app-20220120140458-0039/0
2022-01-20 14:05:32,318 INFO worker.ExecutorRunner: Runner thread for executor app-20220120140458-0039/0 interrupted
2022-01-20 14:05:32,319 INFO worker.ExecutorRunner: Killing process!
2022-01-20 14:05:32,755 INFO worker.Worker: Executor app-20220120140458-0039/0 finished with state KILLED exitStatus 143
2022-01-20 14:05:32,756 INFO shuffle.ExternalShuffleBlockResolver: Clean up non-shuffle and non-RDD files associated with the finished executor 0
2022-01-20 14:05:32,756 INFO shuffle.ExternalShuffleBlockResolver: Executor is not registered (appId=app-20220120140458-0039, execId=0)
2022-01-20 14:05:32,756 INFO shuffle.ExternalShuffleBlockResolver: Application app-20220120140458-0039 removed, cleanupLocalDirs = true
2022-01-20 14:05:32,756 INFO worker.Worker: Cleaning up local directories for application app-20220120140458-0039
2022-01-20 14:06:08,352 INFO worker.Worker: Asked to launch executor app-20220120140608-0040/0 for testSpark
2022-01-20 14:06:08,355 INFO spark.SecurityManager: Changing view acls to: root
2022-01-20 14:06:08,355 INFO spark.SecurityManager: Changing modify acls to: root
2022-01-20 14:06:08,355 INFO spark.SecurityManager: Changing view acls groups to: 
2022-01-20 14:06:08,355 INFO spark.SecurityManager: Changing modify acls groups to: 
2022-01-20 14:06:08,355 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()

The worker hosts the executor; its execution logs can be inspected under its logs directory.
The Spark worker web UI port defaults to 8081; I changed it to 6081, but it's best to keep the default ports where possible.
TODO (2022-01-23): how to view these logs from the command line.
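A sketch of one way to do this, assuming the default `$SPARK_HOME` directory layout on the workers (the app id is the one from the log above; adjust host names and paths as needed):

```shell
# Worker daemon log (the lines quoted above come from here):
ssh spark-worker-1 'tail -n 100 $SPARK_HOME/logs/spark-*-org.apache.spark.deploy.worker.Worker-*.out'

# Per-executor stdout/stderr live under the worker's work/ directory,
# keyed by app id and executor id:
ssh spark-worker-1 'tail -f $SPARK_HOME/work/app-20220120140608-0040/0/stderr'
```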

References:
Running Spark 2.x from IDEA on a remote cluster with debugging
Remote-debugging Spark from IDEA
