Scenario: an error occurs when submitting a Spark job to YARN.
Cause: insufficient virtual memory on the NodeManager.
Solution:
Either of the following two approaches resolves it:
Raise the virtual-to-physical memory ratio:
yarn.nodemanager.vmem-pmem-ratio (default: 2.1)
Or disable the virtual memory check altogether:
yarn.nodemanager.vmem-check-enabled (default: true)
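As a concrete sketch, both properties go into yarn-site.xml on each NodeManager (followed by a NodeManager restart). The property names and defaults are from the note above; the example ratio value of 4 is an assumption to illustrate the idea, not a recommendation:

```xml
<!-- yarn-site.xml: pick ONE of the two approaches below. -->
<configuration>
  <!-- Option 1: raise the virtual-to-physical memory ratio (default 2.1). -->
  <property>
    <name>yarn.nodemanager.vmem-pmem-ratio</name>
    <!-- 4 is an illustrative value; tune for your workload -->
    <value>4</value>
  </property>

  <!-- Option 2: disable the virtual memory check entirely (default true). -->
  <property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
  </property>
</configuration>
```

Disabling the check is the quicker fix for test environments; raising the ratio keeps the safety check in place and is usually preferable on shared clusters.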
Here is the scenario in which the error occurred.
The command submitted was:
./spark-submit --master yarn --class org.apache.spark.examples.SparkPi ../examples/jars/spark-examples_2.11-2.2.1.jar 1
The resulting error:
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
Open the Hadoop web UI to check the failed application's status and logs.
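If the web UI is not handy, the same container logs can be pulled with the standard YARN CLI (this assumes log aggregation is enabled; the application ID below is a placeholder for the one printed by spark-submit or shown in the ResourceManager page):

```shell
# Fetch aggregated container logs for the failed application and
# look for the virtual-memory-limit message that confirms the cause.
# application_XXXXXXXXXXXXX_XXXX is a placeholder for the real ID.
yarn logs -applicationId application_XXXXXXXXXXXXX_XXXX | grep -i "virtual memory"
```

A log line mentioning that a container "is running beyond virtual memory limits" confirms that the fix above (raising the ratio or disabling the check) applies.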