Using the following code:
val file = spark.sparkContext.textFile("file:///home/iie4bu/data/hello.txt")
val wordCounts = file.flatMap(line => line.split(",")).map(word => (word, 1)).reduceByKey(_ + _)
wordCounts.collect
We can see that two Workers are running this job.
This job:
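To see what the three RDD stages compute, here is a minimal local Scala sketch of the same word-count logic on a plain collection, with no Spark dependency. The `groupBy` plus per-group sum stands in for `reduceByKey(_ + _)`; the sample input lines are made up for illustration.

```scala
// Local sketch of the word-count pipeline above, without Spark.
// flatMap splits each line on commas, map pairs each word with 1,
// and groupBy + sum plays the role of reduceByKey(_ + _).
object WordCountSketch {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(","))                          // flatMap(line => line.split(","))
      .map(word => (word, 1))                         // map(word => (word, 1))
      .groupBy(_._1)                                  // shuffle words to their key
      .map { case (w, pairs) => (w, pairs.map(_._2).sum) } // sum the 1s per key

  def main(args: Array[String]): Unit = {
    val sample = Seq("hello,world", "hello,spark")    // hypothetical file contents
    println(wordCount(sample))                        // e.g. Map(hello -> 2, world -> 1, spark -> 1)
  }
}
```

In the real job, `reduceByKey` performs this per-key aggregation in a distributed shuffle across the two Workers, combining partial counts on each partition before the final merge.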