```text
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.IllegalArgumentException: Required executor memory (1024), overhead (384 MB), and PySpark memory (0 MB) is above the max threshold (1024 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
```
The total container request (1024 MB executor memory + 384 MB overhead = 1408 MB) exceeds the cluster's 1024 MB per-container cap. The fix is to increase `yarn.scheduler.maximum-allocation-mb` and `yarn.nodemanager.resource.memory-mb` to larger values:
![](https://img-blog.csdnimg.cn/img_convert/3b61e669d5f7c67237271bc8b14911c6.png)
![](https://img-blog.csdnimg.cn/img_convert/22bf0715098f22597eeba5c8fe6a37fd.png)
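The screenshots above correspond to editing `yarn-site.xml`. A minimal sketch of the two properties, with example values that are assumptions (pick sizes that fit your nodes); restart the NodeManager and ResourceManager afterwards for the change to take effect:

```xml
<!-- yarn-site.xml: example values only, adjust to your hardware -->
<property>
  <!-- Total memory YARN may allocate on each NodeManager -->
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>4096</value>
</property>
<property>
  <!-- Largest single container YARN will grant; must cover
       executor memory + overhead (here 1024 + 384 = 1408 MB) -->
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>2048</value>
</property>
```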
Alternatively, if a smaller heap still meets your needs, you can request less memory per executor with `--executor-memory` instead of changing the cluster configuration.
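For example (a sketch; the application file name is a placeholder), requesting 512 MB executors keeps the total request at 512 MB + 384 MB overhead = 896 MB, which fits under the 1024 MB cap:

```shell
# 512m executor + 384 MB default overhead = 896 MB <= 1024 MB cap
# your_app.py is a placeholder for your actual PySpark application
spark-submit \
  --master yarn \
  --executor-memory 512m \
  your_app.py
```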