Counting the number of occurrences of each word in the file helloSpark.txt on drive E:
1. Test code:
import org.apache.spark.{SparkConf, SparkContext}

object spamm {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("wordcount")
    val sc = new SparkContext(conf)
    // Read the input file and split each line into words
    val input = sc.textFile("E:/helloSpark.txt")
    val words = input.flatMap(line => line.split(" "))
    // Pair each word with 1, then sum the counts per word
    val counts = words.map(word => (word, 1)).reduceByKey { case (x, y) => x + y }
    // saveAsTextFile returns Unit; it writes the result as a directory of part files
    counts.saveAsTextFile("E:/helloSparkRes")
    sc.stop()
  }
}
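To see what the flatMap / map / reduceByKey pipeline computes, the same logic can be sketched on plain Scala collections, with no Spark cluster needed. The object name WordCountSketch and the sample lines below are illustrative, not part of the original program:

```scala
// Plain-Scala sketch of the word-count pipeline; no Spark required.
object WordCountSketch {
  // reduceByKey on an RDD behaves like groupBy + a per-group sum here
  def count(lines: List[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))        // like RDD.flatMap: one element per word
      .map(word => (word, 1))       // like RDD.map: pair each word with 1
      .groupBy(_._1)                // group the pairs by word
      .map { case (word, pairs) => (word, pairs.map(_._2).sum) }

  def main(args: Array[String]): Unit =
    println(count(List("hello spark", "hello world")))
}
```

Running this prints a map such as hello -> 2, spark -> 1, world -> 1, which mirrors the per-word counts Spark writes to the output directory.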
2. Setting the master
Because Spark is run in local mode here, the master must be specified, for example with the JVM option -Dspark.master=local in the run configuration. For details see https://blog.csdn.net/shenlanzifa/article/details/42679577
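As an alternative to the JVM option, the master can also be set directly in code via SparkConf.setMaster. A minimal sketch (local[*], meaning "use all local cores", is one common choice; plain local runs single-threaded):

```scala
import org.apache.spark.SparkConf

object ConfSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("wordcount")
      .setMaster("local[*]") // same effect as passing -Dspark.master=local[*]
    println(conf.get("spark.master"))
  }
}
```

Setting the master in code overrides the need for the command-line option, but hard-coding it makes the jar less portable to a real cluster, so the JVM option is preferable for anything beyond local testing.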