A detailed explanation of the executor-memory parameter in Spark
https://blog.csdn.net/wisgood/article/details/77857039#commentsedit
hadoop - Yarn - why doesn't task go out of heap space but container gets killed? - Stack Overflow
Container killed by YARN for exceeding memory limits. 52.6 GB of 50 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead - Stack Overflow
amazon web services - Boosting spark.yarn.executor.memoryOverhead - Stack Overflow
https://stackoverflow.com/questions/38101857/boosting-spark-yarn-executor-memoryoverhead
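The "Container killed by YARN for exceeding memory limits" error in the links above arises because YARN enforces a physical-memory cap equal to the executor heap plus an off-heap overhead. A minimal sketch of that sizing arithmetic, assuming the Spark 2.x default of `max(384 MB, 10% of executor memory)` for `spark.yarn.executor.memoryOverhead` (the helper name is illustrative, not a Spark API):

```python
# Sketch of YARN container sizing for a Spark executor.
# Assumes the Spark 2.x default: overhead = max(384 MB, 10% of executor memory).

def yarn_container_mb(executor_memory_mb, memory_overhead_mb=None):
    """Total physical memory (MB) YARN allows the executor container."""
    if memory_overhead_mb is None:
        # Default overhead when spark.yarn.executor.memoryOverhead is unset.
        memory_overhead_mb = max(384, int(executor_memory_mb * 0.10))
    return executor_memory_mb + memory_overhead_mb

# With --executor-memory 50g and no explicit overhead, the container cap is
# 51200 MB + 5120 MB = 56320 MB; off-heap usage (netty buffers, Python
# workers, etc.) beyond that gets the container killed, hence the advice to
# boost spark.yarn.executor.memoryOverhead rather than the heap itself.
print(yarn_container_mb(50 * 1024))
```

Raising the overhead (e.g. `--conf spark.yarn.executor.memoryOverhead=8192`) enlarges the cap without growing the JVM heap, which is usually the right fix when the heap itself never OOMs, as the linked answers note.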