When running a Spark program from IntelliJ IDEA with sbt, I hit this error: java.lang.NoClassDefFoundError: org/apache/spark/SparkConf.
Solution:
The "provided" scope tells sbt that the dependency will be supplied by the runtime environment (e.g., a Spark cluster), so it is excluded from the classpath when you run the program directly from IDEA, which is why SparkConf cannot be found. In build.sbt, change

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.1" % "provided"

to
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.1" % "compile"
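
For context, a minimal build.sbt along these lines should work; the project name and Scala patch version are my assumptions, not from the original post, but the Scala version must match the _2.11 suffix of the artifact:

name := "spark-demo"        // hypothetical project name
version := "0.1"
scalaVersion := "2.11.8"    // must match the _2.11 artifact suffix
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.1" % "compile"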
Thanks to netizen 黑眼圈@~@ for the reminder: after editing build.sbt, refresh the sbt project in IDEA, otherwise the change does not take effect.
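
To confirm the fix, a small smoke test like the sketch below (object name and app name are hypothetical) should now run locally from IDEA, since SparkConf loads from the compile-scoped dependency:

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical smoke test: if SparkConf loads, spark-core is on the runtime classpath.
object SmokeTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SmokeTest").setMaster("local[*]")
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 10).sum())  // should print 55.0
    sc.stop()
  }
}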