spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode client $SPARK_HOME/examples/jars/spark-examples_2.11-2.2.0.jar 100
The error was as follows:
ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
Viewing the application details in the Hadoop (YARN) web UI showed:
The container's actual virtual memory usage was 2.2 GB, which exceeded the 2.1 GB limit, so YARN killed the container.
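The 2.1 GB cap comes from YARN's virtual-memory check: the NodeManager multiplies the container's physical allocation by yarn.nodemanager.vmem-pmem-ratio, whose default is 2.1. A minimal sketch of that arithmetic, assuming the common default of a 1024 MB physical container allocation:

```shell
# Assumptions: 1024 MB physical allocation, default vmem-pmem ratio of 2.1
physical_mb=1024
ratio=2.1

# Virtual-memory cap in GB = physical allocation * ratio / 1024
awk -v p="$physical_mb" -v r="$ratio" \
    'BEGIN { printf "vmem limit: %.1f GB\n", p * r / 1024 }'

# A container using 2.2 GB of virtual memory exceeds this 2.1 GB cap,
# so the NodeManager kills it.
```

This is why the log reports a 2.1 GB limit for a 1 GB container: 1024 MB × 2.1 ≈ 2.1 GB.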
Fix: add the following property to yarn-site.xml to disable the virtual-memory check:
<property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
</property>

Then restart Hadoop and resubmit the Spark job.
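Restarting and resubmitting might look like the following. This is a sketch assuming a standard Hadoop layout with the stock sbin scripts on the PATH (adjust paths to your installation):

```shell
# Restart YARN so the NodeManagers pick up the new yarn-site.xml
# (stop-yarn.sh / start-yarn.sh are the stock scripts in $HADOOP_HOME/sbin)
stop-yarn.sh
start-yarn.sh

# Resubmit the example job
spark-submit \
    --class org.apache.spark.examples.SparkPi \
    --master yarn \
    --deploy-mode client \
    $SPARK_HOME/examples/jars/spark-examples_2.11-2.2.0.jar 100
```

Only YARN needs a restart for this property; HDFS daemons can stay up.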