Container killed by YARN for exceeding memory limits

Spark on YARN: not enough resources

I thought the resources I had requested were enough, but the job died after running for about three hours. Checking the logs on the YARN UI, I saw the following:

20/03/12 11:33:03 ERROR YarnClusterScheduler: Lost executor 2 on node-ana-corePKeW: Container killed by YARN for exceeding memory limits. 1.5 GB of 1.5 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
20/03/12 11:33:05 ERROR YarnClusterScheduler: Lost executor 3 on node-ana-coreQELy: Container killed by YARN for exceeding memory limits. 1.5 GB of 1.5 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
20/03/12 11:33:08 ERROR YarnClusterScheduler: Lost executor 1 on node-ana-coreqdOz: Container killed by YARN for exceeding memory limits. 1.5 GB of 1.5 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
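The "1.5 GB of 1.5 GB physical memory used" figure is the container's total limit, which is the executor heap plus the off-heap overhead. As a rough sketch, assuming the default overhead formula documented for Spark on YARN (max of 384 MB and 10% of spark.executor.memory), the limit can be computed like this:

```shell
# Assumed default: overhead = max(384 MB, 10% of spark.executor.memory).
# All values are in MB.
overhead_mb() {
  local mem=$1
  local tenth=$(( mem / 10 ))
  echo $(( tenth > 384 ? tenth : 384 ))
}

# Total physical memory YARN allows the container before killing it.
container_limit_mb() {
  local mem=$1
  echo $(( mem + $(overhead_mb "$mem") ))
}
```

For example, container_limit_mb 1024 prints 1408 (1 GB heap + the 384 MB minimum overhead), which is why a container can be killed even when the heap itself never fills up.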

The only fix was to redeploy the jar with more memory; after raising it, the job ran fine. In short, when you see the memoryOverhead hint in this error, the executors have run out of memory, so increase the Spark job's memory settings.
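A minimal sketch of what that resubmission could look like; the class name, jar name, and memory values here are placeholders, not the actual job from this post, so tune them to your cluster:

```shell
# Illustrative values only; com.example.MyJob and my-job.jar are placeholders.
# spark.yarn.executor.memoryOverhead takes a plain number of MB.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.executor.memory=4g \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  --class com.example.MyJob \
  my-job.jar
```

Raising spark.yarn.executor.memoryOverhead directly (as the error message suggests) is often enough on its own when the heap is fine but off-heap usage pushes the container over its limit.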
