Stopping a Running Spark Application

When running a Spark cluster in standalone mode, a Spark application was submitted with the --deploy-mode cluster --supervise options for fault tolerance. The cluster now needs to keep running, but the application must be stopped. Stopping and restarting the cluster, killing the DriverWrapper daemon, and deleting temporary files have all been tried, yet the application always resumes. The question is how to stop the application without changing its code, or what other method can stop it while keeping the cluster running. The solution is to use the ./bin/spark-class org.apache.spark.deploy.Client kill command; the driver ID can be found in the Master web UI.
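A minimal sketch of that kill command, assuming a standalone master at spark://master:7077 and a driver ID copied from the Master web UI (both values are placeholders for your own):

    ./bin/spark-class org.apache.spark.deploy.Client kill spark://master:7077 driver-20230101123456-0000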

I'm running a Spark cluster in standalone mode.

I've submitted a Spark application in cluster mode using options:

--deploy-mode cluster --supervise

So that the job is fault tolerant.
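For context, a submission along these lines would look roughly as follows; the master URL, application class, and jar path are hypothetical placeholders:

    ./bin/spark-submit \
      --master spark://master:7077 \
      --deploy-mode cluster \
      --supervise \
      --class com.example.Main \
      /path/to/app.jar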

Now I need to keep the cluster running but stop the application from running.

Things I have tried:

Stopping the cluster and restarting it, but the application resumes execution when I do that.

Using kill -9 on a daemon named DriverWrapper, but the job resumes again after that.

Removing temporary files and directories and restarting the cluster, but the job resumes again.

So the running application is really fault tolerant.

Question:

Based on the above scenario, can someone suggest how I can stop the job from running, or what else I can try to stop the application while keeping the cluster running?
