Spark on YARN: Insufficient Resources
I thought the resources I had requested were enough, but the job died after running for about 3 hours. Checking the application logs on YARN, I found the following errors:
20/03/12 11:33:03 ERROR YarnClusterScheduler: Lost executor 2 on node-ana-corePKeW: Container killed by YARN for exceeding memory limits. 1.5 GB of 1.5 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
20/03/12 11:33:05 ERROR YarnClusterScheduler: Lost executor 3 on node-ana-coreQELy: Container killed by YARN for exceeding memory limits. 1.5 GB of 1.5 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
20/03/12 11:33:08 ERROR YarnClusterScheduler: Lost executor 1 on node-ana-coreqdOz: Container killed by YARN for exceeding memory limits. 1.5 GB of 1.5 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
The only fix was to increase the memory settings and redeploy the jar; after that, the job ran fine. Whenever you see this memoryOverhead message, it means a container's physical memory (the executor's JVM heap plus its off-heap overhead) has exceeded the limit YARN allocated for it, so you need to give the Spark job more memory.
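As a minimal sketch, the extra memory can be granted at submit time. The sizes and jar name below are illustrative placeholders, not the values from this job; the right numbers depend on your workload and must fit within the cluster's yarn.scheduler.maximum-allocation-mb. By default spark.yarn.executor.memoryOverhead is 10% of the executor memory (with a 384 MB floor), and the YARN container size is roughly the sum of the heap and the overhead.

# Resubmit with a larger heap and an explicit off-heap overhead.
# All sizes and the jar name are placeholders; tune them to your job.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 3 \
  --executor-memory 2g \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  my-spark-job.jar

Note that on Spark 2.3 and later this setting was renamed to spark.executor.memoryOverhead, though the older spark.yarn.executor.memoryOverhead name (the one this log message suggests) is still recognized as a deprecated alias.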