1. Example configuration
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("product3_source")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.memory.useLegacyMode", "true")      // opt back into the pre-1.6 "static" memory model
  .set("spark.storage.memoryFraction", "0.2")     // shrink the storage (cache) region
  .set("spark.shuffle.memoryFraction", "0.7")     // enlarge the shuffle region
val sc = new SparkContext(conf)
2. How the usable memory is actually computed
Storage region: spark.storage.safetyFraction sets the safety margin (default 0.9), and spark.storage.memoryFraction sets the region's share of the heap (default 0.6). So by default at most
0.9 × 0.6 = 0.54 (54% of the executor heap) is usable for caching.
Shuffle region: spark.shuffle.safetyFraction sets the safety margin (default 0.8), and spark.shuffle.memoryFraction sets the region's share (default 0.2). So by default at most
0.8 × 0.2 = 0.16 (16% of the executor heap) is usable for shuffle.
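The arithmetic above can be sketched as a small Scala program. The 10 GiB executor heap is a hypothetical value for illustration, and the fractions are the legacy-model defaults quoted above:

```scala
// Sketch of the legacy-mode memory arithmetic, assuming a hypothetical 10 GiB executor heap.
object LegacyMemoryCalc {
  def main(args: Array[String]): Unit = {
    val executorHeapMiB = 10 * 1024 // hypothetical --executor-memory 10g

    // Defaults of the legacy ("static") memory model
    val storageFraction = 0.6 // spark.storage.memoryFraction
    val storageSafety   = 0.9 // spark.storage.safetyFraction
    val shuffleFraction = 0.2 // spark.shuffle.memoryFraction
    val shuffleSafety   = 0.8 // spark.shuffle.safetyFraction

    val usableStorageMiB = executorHeapMiB * storageFraction * storageSafety // 0.54 of heap
    val usableShuffleMiB = executorHeapMiB * shuffleFraction * shuffleSafety // 0.16 of heap

    println(f"storage: $usableStorageMiB%.1f MiB (${storageFraction * storageSafety}%.2f of heap)")
    println(f"shuffle: $usableShuffleMiB%.1f MiB (${shuffleFraction * shuffleSafety}%.2f of heap)")
  }
}
```

With the defaults, a 10 GiB heap yields about 5530 MiB for caching and 1638 MiB for shuffle, which is why the configuration in section 1 rebalances the two fractions for a shuffle-heavy job.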
3. spark.yarn.executor.memoryOverhead
While an executor is running, its actual memory use can exceed executor-memory (off-heap allocations, JVM overhead, etc.), so YARN reserves an extra slice of memory for it. spark.yarn.executor.memoryOverhead is that extra slice. The container size requested from YARN is therefore:
executorMem = executorMemory + spark.yarn.executor.memoryOverhead
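A minimal sketch of this formula, assuming the default overhead rule used by Spark's YARN backend (max of 384 MiB and 10% of the executor memory; verify the constants against your Spark version):

```scala
// Container size requested from YARN = executor heap + memory overhead.
// Assumes the default overhead rule: max(384 MiB, 10% of executor memory).
object YarnContainerSize {
  def overheadMiB(executorMemoryMiB: Int): Int =
    math.max(384, (executorMemoryMiB * 0.10).toInt)

  def containerMiB(executorMemoryMiB: Int): Int =
    executorMemoryMiB + overheadMiB(executorMemoryMiB)

  def main(args: Array[String]): Unit = {
    // 10 GiB heap: overhead is 1024 MiB, so YARN is asked for 11264 MiB.
    println(containerMiB(10240))
    // 2 GiB heap: 10% is only 204 MiB, so the 384 MiB floor applies -> 2432 MiB.
    println(containerMiB(2048))
  }
}
```

This is why a container can be killed by YARN even though the JVM heap never fills up: the limit YARN enforces is executorMem, not executor-memory alone, and raising spark.yarn.executor.memoryOverhead is the usual fix.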