The errors:
Error:(6, 12) object apache is not a member of package org
import org.apache.spark.SparkContext._^
Error:(7, 12) object apache is not a member of package org
import org.apache.spark.{SparkConf, SparkContext}
^
Error:(16, 20) not found: type SparkConf
val conf = new SparkConf().setAppName("HelloSpark")
^
Error:(17, 18) not found: type SparkContext
val sc = new SparkContext(conf)
^
Solution:
All four errors have the same cause: the Spark jar is missing from the module's classpath, so the compiler cannot resolve anything under org.apache.spark. Right-click the project → Open Module Settings → Libraries → add spark-assembly-1.0.0-hadoop1.0.4.jar (the Spark assembly jar), then save. The imports and the SparkConf/SparkContext references will then compile.
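If the project is built with sbt rather than with manually attached jars, the equivalent fix is to declare Spark as a dependency instead. A minimal sketch, assuming Spark 1.0.0 to match the assembly jar above (the project name and Scala version are illustrative assumptions):

```scala
// build.sbt — sbt equivalent of adding the assembly jar by hand.
// The version 1.0.0 matches spark-assembly-1.0.0-hadoop1.0.4.jar used above.
name := "HelloSpark"

// Spark 1.0.0 was published for Scala 2.10 (assumed here)
scalaVersion := "2.10.4"

// %% appends the Scala binary version, resolving to spark-core_2.10
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
```

After refreshing the sbt project in IDEA, the library lands on the module classpath automatically, which avoids re-attaching the jar whenever the project settings are regenerated.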