Spark stop-all fails

Running stop-all.sh does not succeed; it prints the following:

[root@sm61 sbin]# ./stop-all.sh
s92.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s95.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s76.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s93.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s74.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s78.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s71.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s73.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s72.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s99.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s75.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s94.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s77.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s96.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s97.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s98.spark.starv.com: no org.apache.spark.deploy.worker.Worker to stop
s91.spark.starv.com: ssh: connect to host s91.spark.starv.com port 22: No route to host
no org.apache.spark.deploy.master.Master to stop

The script cannot stop the daemons because it cannot find their process IDs. (The s91 line is a separate problem: that host is simply unreachable over SSH.) So where are the process IDs stored?

spark-daemon.sh contains the following:

# fall back to /tmp when SPARK_PID_DIR is not set
if [ "$SPARK_PID_DIR" = "" ]; then
  SPARK_PID_DIR=/tmp
fi
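
For context, spark-daemon.sh then derives the PID file path from this directory. In the stock Spark scripts the naming looks roughly like the line below (SPARK_IDENT_STRING defaults to the user running the daemon; the exact form may vary between Spark versions):

pid="$SPARK_PID_DIR/spark-$SPARK_IDENT_STRING-$command-$instance.pid"
# e.g. /tmp/spark-root-org.apache.spark.deploy.worker.Worker-1.pid

stop-all.sh reads this file to find the process to kill; when the file is missing, it prints exactly the "no ... to stop" message shown above.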


If the $SPARK_PID_DIR environment variable is not set, the PID files are written to /tmp. The problem is that the system periodically cleans out /tmp, so the PID files disappear while the daemons keep running.
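
How and when /tmp gets cleaned depends on the distribution; the checks below use typical file locations, which may differ on your system:

# systemd-based systems: systemd-tmpfiles prunes /tmp according to rules
# such as "q /tmp 1777 root root 10d" (delete entries older than 10 days)
cat /usr/lib/tmpfiles.d/tmp.conf

# older RHEL/CentOS: a daily tmpwatch cron job does the cleaning
cat /etc/cron.daily/tmpwatch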

So I set this environment variable in conf/spark-env.sh and restarted the cluster.
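
A minimal sketch of the fix, assuming /data/spark/pids as the PID directory (any path outside /tmp that exists on every node and is writable by the user running Spark will do):

# on every node: create the PID directory
mkdir -p /data/spark/pids

# conf/spark-env.sh (must be distributed to all nodes)
export SPARK_PID_DIR=/data/spark/pids

Note that the old PID files are already gone, so stop-all.sh still cannot stop the daemons that are currently running; those have to be found and killed by hand on each node (e.g. locate the Worker/Master processes with jps or ps -ef | grep spark) before restarting with the new setting.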

Reposted from: https://www.cnblogs.com/soupwater/p/7159064.html
