java.lang.IllegalArgumentException: System memory 100663296 must be at least 4.718592E8. Please use a larger heap size.
While developing a Spark project in Eclipse and trying to run the program against Spark, I hit the error shown above.
Clearly, the JVM was not given enough memory for the SparkContext to start up. But where should the memory be set?
I first checked the launch script:
#!/bin/bash
# Note: spark-submit options must come before the application JAR;
# anything placed after the JAR is treated as an argument to the application itself.
/usr/local/spark-1.6.0/bin/spark-submit \
--class cn.spark.study.Opt17_WordCount \
--master spark://yun01:7077 \
--num-executors 3 \
--driver-memory 100m \
--executor-memory 100m \
--executor-cores 3 \
/root/sparkstudy/Java/spark-study-java-0.0.1-SNAPSHOT-jar-with-dependencies.jar
So it looked like the driver was short on memory. But even after increasing the driver memory to 400m, it still failed with essentially the same error:
Exception in thread "main" java.lang.IllegalArgumentException: System memory 402128896 must be at least 4.718592E8. Please use a larger heap size.
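The odd-looking number 4.718592E8 is 471,859,200 bytes, i.e. 450 MB. Since the unified memory manager introduced in Spark 1.6, the driver reserves 300 MB of system memory and requires the available heap to be at least 1.5 times that. The startup check looks roughly like the sketch below (a simplified approximation of Spark 1.6's UnifiedMemoryManager; names and structure are paraphrased, not copied from the source):

import org.apache.spark.SparkConf

object MemoryCheckSketch {
  // Spark 1.6 reserves 300 MB of "system" memory for internal bookkeeping
  val RESERVED_SYSTEM_MEMORY_BYTES: Long = 300 * 1024 * 1024

  def checkSystemMemory(conf: SparkConf): Long = {
    // spark.testing.memory, if set, overrides the JVM's real max heap
    val systemMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory)
    // 314572800 * 1.5 = 4.718592E8 bytes = 450 MB, hence the number in the error message
    val minSystemMemory = RESERVED_SYSTEM_MEMORY_BYTES * 1.5
    if (systemMemory < minSystemMemory) {
      throw new IllegalArgumentException(
        s"System memory $systemMemory must be at least $minSystemMemory. " +
          "Please use a larger heap size.")
    }
    systemMemory
  }
}

So neither 100m nor 400m of driver memory can pass the check: the driver heap has to be at least 450 MB, and in practice a little more, because Runtime.getRuntime.maxMemory usually reports slightly less than the -Xmx value.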
At that point I just increased it further, to 1g (this requirement seems to have appeared only after Spark moved to 1.5/1.6).
After rerunning, the job finished normally and produced the expected result.
You can also set this directly in the code:
val conf = new SparkConf().setAppName("word count")
// spark.testing.memory is read as a raw byte count (via conf.getLong), so use a plain number;
// any value above 471859200 bytes (450 MB) passes the check
conf.set("spark.testing.memory", "536870912") // 512 MB
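For context, here is a minimal, self-contained driver with that setting applied. This is only a sketch: the master URL, input path and object name are placeholders I made up, not taken from the original project.

import org.apache.spark.{SparkConf, SparkContext}

object WordCountWithTestingMemory {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("word count")
      .setMaster("local[*]")                    // placeholder: run inside the IDE
      .set("spark.testing.memory", "536870912") // 512 MB in bytes, above the 450 MB minimum

    val sc = new SparkContext(conf)
    val counts = sc.textFile("hdfs://yun01:9000/input.txt") // placeholder input path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}

The spark.testing.memory trick is handy for local runs in the IDE; for an actual cluster submit, raising --driver-memory (and --executor-memory) past 450 MB is the cleaner fix.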