- executor-cores: number of cores per executor; defaults to 1. The official recommendation is 2-5; we use 4 in our company.
- num-executors: number of executors to launch; defaults to 2.
- executor-memory: memory per executor; defaults to 1G.
- driver-cores: number of cores used by the driver; defaults to 1.
- driver-memory: driver memory size; defaults to 512M.
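The same settings can also be supplied programmatically rather than on the command line. A minimal sketch of the mapping from each flag to its Spark configuration property (the property names come from Spark's configuration reference; the values are the example ones above):

```python
# spark-submit flag -> (equivalent Spark configuration property, example value)
flag_to_property = {
    "--executor-cores":  ("spark.executor.cores",     "4"),
    "--num-executors":   ("spark.executor.instances", "10"),
    "--executor-memory": ("spark.executor.memory",    "8g"),
    "--driver-cores":    ("spark.driver.cores",       "2"),
    "--driver-memory":   ("spark.driver.memory",      "8g"),
}

# Collapse into a plain property map, as you would pass via --conf or SparkConf.
conf = dict(flag_to_property.values())
print(conf["spark.executor.instances"])  # -> 10
```

Passing these as `--conf spark.executor.cores=4` etc. is equivalent to the dedicated flags; the dedicated flags simply take precedence in everyday use.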
Example submit command:
spark-submit \
--master yarn \
--name "Spark Job Name" \
--driver-cores 2 \
--driver-memory 8g \
--executor-cores 4 \
--num-executors 10 \
--executor-memory 8g \
--class PackageName.ClassName \
XXXX.jar \
InputPath \
OutputPath
Note: executor-cores * num-executors is the number of tasks that can run in parallel.
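The note above can be checked with a quick calculation, using the example values (4 cores x 10 executors) and a hypothetical stage of 200 tasks:

```python
import math

executor_cores = 4   # --executor-cores
num_executors = 10   # --num-executors

# Number of tasks that can run in parallel across the cluster.
parallel_slots = executor_cores * num_executors
print(parallel_slots)  # -> 40

# A stage of 200 tasks (hypothetical size) runs in ceil(200 / 40) waves.
stage_tasks = 200
waves = math.ceil(stage_tasks / parallel_slots)
print(waves)  # -> 5
```

If a stage's task count is much smaller than the slot count, some cores sit idle; this is why executor sizing is usually tuned together with the data's partition count.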
Reference: https://blog.csdn.net/weixin_48077303/article/details/123243056