Grouped sorting with Spark RDDs using groupByKey + flatMap + zipWithIndex
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("name").setMaster("local[2]")
val context = new SparkContext(conf)
// sample (key, value) pairs to rank within each key
val ssh = List(("ma", 3), ("ma", 4), ("ma", 5), ("mb", 2), ("mb", 5))
val unit = context.makeRDD(ssh)
// groupByKey collects all values for a key; sort each group descending,
// then zipWithIndex assigns a position, shifted to a 1-based rank
val ranked = unit.groupByKey().flatMap { case (key, values) =>
  values.toList.sortWith(_ > _).zipWithIndex.map {
    case (value, idx) => (key, value, idx + 1)
  }
}
ranked.collect().foreach(println)
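The grouping-and-ranking step itself can be checked without a cluster. A minimal plain-Scala sketch of the same groupByKey + flatMap + zipWithIndex logic, using the sample pairs from the snippet above (the object and method names here are my own, not from the original post):

```scala
// Plain-Scala sketch of per-key descending ranking, mirroring the RDD version.
object RankSketch {
  // For each key, sort its values descending and attach a 1-based rank.
  def rank(pairs: List[(String, Int)]): List[(String, Int, Int)] =
    pairs.groupBy(_._1).toList.flatMap { case (key, kvs) =>
      kvs.map(_._2).sortWith(_ > _).zipWithIndex.map {
        case (value, idx) => (key, value, idx + 1)
      }
    }

  def main(args: Array[String]): Unit =
    rank(List(("ma", 3), ("ma", 4), ("ma", 5), ("mb", 2), ("mb", 5)))
      .sorted
      .foreach(println)
}
```

This keeps the per-key sort on an in-memory list, exactly as the flatMap body does on the grouped RDD values.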
Original · Published 2019-02-26 13:50:49