Error message:
Exception in thread "main" org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243).
To ignore this error, set spark.driver.allowMultipleContexts = true.
The currently running SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext
Cause:
This error occurs because two SparkContexts are created in the same JVM. In the code below, `new JavaStreamingContext(conf, ...)` internally creates its own SparkContext, so the explicitly created JavaSparkContext is one too many; deleting the JavaSparkContext line is enough:
SparkConf conf = new SparkConf()
        .setAppName("myapplication").setMaster("local[4]");
JavaSparkContext jsc = new JavaSparkContext(conf);
JavaStreamingContext stream = new JavaStreamingContext(conf, Durations.seconds(10));
There are two ways to fix this:
Option 1: pass the SparkConf directly to JavaStreamingContext and let it create the SparkContext internally:
SparkConf conf = new SparkConf()
        .setAppName("myapplication").setMaster("local[4]");
JavaStreamingContext stream = new JavaStreamingContext(conf, Durations.seconds(10));
Option 2: create the JavaSparkContext first and pass it (not the SparkConf) to JavaStreamingContext, so the existing context is reused:
SparkConf conf = new SparkConf()
        .setAppName("myapplication").setMaster("local[4]");
JavaSparkContext jsc = new JavaSparkContext(conf);
JavaStreamingContext stream = new JavaStreamingContext(jsc, Durations.seconds(10));