spark-submit reports an error like this:
WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
The cause is insufficient memory resources.
My virtual machine has only 1 GB of RAM, so spark-env.sh is configured with:
export SPARK_WORKER_MEMORY=800m
However, when I ran spark-submit without the --executor-memory flag, it requested the default of 1024m per executor, which exceeds the worker's 800m limit,
so the error above appeared.
The fix is to add --executor-memory 512m to the spark-submit command, after which the job runs normally.
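For illustration, a full invocation might look like the sketch below. The master URL, class name, and application jar path are placeholders, not taken from the original post; only the --executor-memory flag is the actual fix described above.

```shell
# Request 512m per executor so it fits within SPARK_WORKER_MEMORY=800m.
# spark://master:7077, com.example.MyApp, and my-app.jar are hypothetical.
spark-submit \
  --master spark://master:7077 \
  --executor-memory 512m \
  --class com.example.MyApp \
  my-app.jar
```

The general rule: the memory each executor requests (default 1g) must not exceed what a worker can offer, or the scheduler will keep the task pending and log the "Initial job has not accepted any resources" warning.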