Today I read part of Spark's official documentation and, following part 4 of the "Spark实战高手之路" series, tried running SparkPi in IntelliJ. I ran into a few problems along the way.
The first appeared right after I imported the required packages and clicked Run:
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:185)

Fix: in the IDE, go to Run -> Edit Configurations and enter "-Dspark.master=local" in the VM options field on the right, which tells the program to run locally in a single thread. Running it again, it still failed:
Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
    at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
    at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
    at akka.actor.RootActorPath.$div(ActorPath.scala:159)
    at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:464)
    at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:452)

Fix: make the Scala version in the project match the Scala version Spark was built against. This kind of NoSuchMethodError on a scala.collection class is the typical symptom of a Scala binary-version mismatch rather than a bug in the program itself.
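As an aside on why the first fix works: JVM "-D" options become system properties, and Spark picks up any spark.* system properties when building its configuration. A minimal sketch below (pure Scala, no Spark dependency; the MasterCheck name and the local[*] fallback are my own illustration, not from the original post):

```scala
// Sketch: how -Dspark.master reaches the program.
// -D flags set JVM system properties, which Spark's config layer reads.
object MasterCheck {
  def main(args: Array[String]): Unit = {
    // Hypothetical fallback for IDE runs where no -D flag was set.
    val master = sys.props.getOrElse("spark.master", "local[*]")
    println(s"master = $master")
    // With Spark on the classpath you could set it in code instead,
    // using the standard SparkConf API:
    //   val conf = new SparkConf().setAppName("SparkPi").setMaster(master)
  }
}
```

Setting the master in code via SparkConf.setMaster is an alternative to the VM option, though hard-coding it makes the jar less portable to a real cluster.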
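For the version fix, the key is that the Scala binary version of the project must match the suffix of the Spark artifact. A hypothetical build.sbt fragment (the exact version numbers here are illustrative, not from the original post):

```scala
// The Scala binary version (2.10 here) must match the _2.10 suffix of
// the Spark artifact, or this NoSuchMethodError appears at runtime.
scalaVersion := "2.10.6"

// %% appends the Scala binary version, resolving to spark-core_2.10.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.3"
```

Using %% instead of a hard-coded artifact suffix lets sbt keep the two versions in sync automatically.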