1. Error description:
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:75)
recommend_util$.<init>(recommend_util.scala:10)
recommend_util$.<clinit>(recommend_util.scala)
recommend_demo1$.main(recommend_demo1.scala:11)
recommend_demo1.main(recommend_demo1.scala)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2456)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2452)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2452)
at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:2554)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:2408)
at recommend_demo1$.main(recommend_demo1.scala:15)
at recommend_demo1.main(recommend_demo1.scala)
2. Solution:
Set the following flag on the SparkConf before the second SparkContext is created. Note that this only suppresses Spark's safety check (see SPARK-2243); it does not make running multiple contexts in one JVM a supported configuration:
sparkConf.set("spark.driver.allowMultipleContexts", "true")
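The stack trace pinpoints the actual conflict: the static initializer of recommend_util (recommend_util.scala:10) creates one SparkContext, and recommend_demo1.main (recommend_demo1.scala:15) then tries to create a second. Rather than allowing multiple contexts, a more robust fix is to create the context once and reuse it everywhere. A minimal sketch, assuming the object names from the trace (the app name and master URL are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Shared utility object: owns the single SparkContext for the JVM.
// (Mirrors recommend_util from the stack trace; appName/master are assumptions.)
object recommend_util {
  val conf: SparkConf = new SparkConf()
    .setAppName("recommend")
    .setMaster("local[*]")
  // lazy: the context is only constructed on first access.
  lazy val sc: SparkContext = new SparkContext(conf)
}

object recommend_demo1 {
  def main(args: Array[String]): Unit = {
    // Reuse the shared context instead of calling `new SparkContext` again,
    // which would trip the "Only one SparkContext" check.
    val sc = recommend_util.sc
    val total = sc.parallelize(1 to 10).sum() // small example job
    println(total)
    sc.stop()
  }
}
```

Alternatively, replacing `new SparkContext(conf)` with `SparkContext.getOrCreate(conf)` at every construction site returns the already-running context instead of failing, which achieves the same single-context behavior without a shared object.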