Spark Error Handling Series: Caused by: java.lang.IllegalArgumentException: Required executor memory 18384 MB, offHeap memory 0 MB, overhead 1838 MB, and PySpark memory 0 MB is above the max threshold 12288 MB of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
1. Full Error Message
- Caused by: java.lang.IllegalArgumentException: Required executor memory (18384 MB), offHeap memory (0) MB, overhead (1838 MB), and PySpark memory (0 MB) is above the max threshold (12288 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
at org.apache.spark.deploy.yarn.Client.verifyClusterResources(Client.scala:362)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:193)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:60)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:201)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:562)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2589)
…
2. Cause of the Error
- This error means the per-executor memory your job requests exceeds the cluster's maximum container size. As the message shows, the required executor memory (18384 MB) plus the memory overhead (1838 MB) comes to 20222 MB in total, which is above the 12288 MB cap that YARN enforces via 'yarn.scheduler.maximum-allocation-mb' (and bounded by 'yarn.nodemanager.resource.memory-mb' on each node).
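The arithmetic behind the check can be sketched as follows. This is a simplified reimplementation of the validation done in `Client.verifyClusterResources`, not Spark's actual source; the 10% factor and 384 MB floor are Spark's documented defaults for `spark.executor.memoryOverhead`:

```python
# Simplified sketch of YARN's executor-memory validation
# (approximates Spark's Client.verifyClusterResources; not the real code).

OVERHEAD_FACTOR = 0.10   # default overhead: 10% of executor memory...
OVERHEAD_MIN_MB = 384    # ...but never less than 384 MB

def total_executor_request_mb(executor_memory_mb, offheap_mb=0, pyspark_mb=0):
    """Total memory YARN must grant for one executor container."""
    overhead_mb = max(int(executor_memory_mb * OVERHEAD_FACTOR), OVERHEAD_MIN_MB)
    return executor_memory_mb + overhead_mb + offheap_mb + pyspark_mb

# Values taken from the error message above:
max_allocation_mb = 12288                       # yarn.scheduler.maximum-allocation-mb
requested = total_executor_request_mb(18384)    # 18384 + 1838 overhead
print(requested)                                # -> 20222
print(requested > max_allocation_mb)            # -> True: request is rejected
```

So to make the job submit, either lower the executor request (e.g. `--executor-memory` small enough that memory plus overhead stays under 12288 MB) or raise `yarn.scheduler.maximum-allocation-mb` / `yarn.nodemanager.resource.memory-mb` on the cluster.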