Implementing reduceByKey and groupByKey with aggregateByKey in Spark

Implementing reduceByKey and groupByKey using Spark's aggregateByKey operator; the concrete function is shown below. The original listing is truncated after the RDD definition, so the `aggregateByKey` call here is a natural completion rather than the author's exact code:

```scala
import org.apache.spark.SparkContext

/**
 * Implement a custom reduceByKey using the aggregateByKey operator.
 */
def aggregateByKeyToReduceBy(sc: SparkContext): Unit = {
  val rdd = sc.parallelize(List(("a", 1), ("b", 2), ("c", 3), ("a", 2), ("a", 3), ("b", 3), ("c", 2)))
  // Zero value 0; the within-partition and cross-partition functions
  // both add, which reproduces reduceByKey(_ + _)
  val reduced = rdd.aggregateByKey(0)(_ + _, _ + _)
  reduced.collect().foreach(println)
}
```

Original post · 2021-11-09 20:11:48 · 1806 reads · 0 comments
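The entry's title also promises a groupByKey variant, which the truncated snippet does not show. The semantics of aggregateByKey (a zero value, a within-partition seqOp, a cross-partition combOp) can be illustrated without a cluster by folding values per key over a plain Scala collection. This is a local sketch of those semantics, not Spark API: `AggregateByKeySemantics` and its method are illustrative names, and with a single simulated "partition" the combOp is accepted but never invoked.

```scala
object AggregateByKeySemantics {
  // Local simulation of aggregateByKey(zero)(seqOp, combOp): the values for
  // each key are folded with seqOp starting from zero. With one simulated
  // "partition" the cross-partition combOp is never needed.
  def aggregateByKey[K, V, U](data: Seq[(K, V)], zero: U)
                             (seqOp: (U, V) => U, combOp: (U, U) => U): Map[K, U] =
    data.groupBy(_._1).map { case (k, pairs) =>
      k -> pairs.map(_._2).foldLeft(zero)(seqOp)
    }

  def main(args: Array[String]): Unit = {
    val data = List(("a", 1), ("b", 2), ("c", 3), ("a", 2), ("a", 3), ("b", 3), ("c", 2))
    // reduceByKey(_ + _) equivalent: zero 0, seqOp and combOp both add
    val sums = aggregateByKey(data, 0)(_ + _, _ + _)
    assert(sums == Map("a" -> 6, "b" -> 5, "c" -> 5))
    // groupByKey equivalent: zero Nil, seqOp appends, combOp concatenates
    val groups = aggregateByKey(data, List.empty[Int])((acc, v) => acc :+ v, _ ++ _)
    assert(groups("a") == List(1, 2, 3))
    println(sums)
  }
}
```

The same fold signature covers both operators: summing into `0` gives reduceByKey, while appending into an empty list gives groupByKey.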
Implementing reduceByKey and groupByKey with combineByKey in Spark

Implementing reduceByKey and groupByKey using Spark's combineByKey operator; the concrete function is shown below. The original listing is truncated after the RDD definition, so the `combineByKey` call here is a natural completion rather than the author's exact code:

```scala
import org.apache.spark.SparkContext

/**
 * Implement reduceByKey using combineByKey.
 * @param sc the SparkContext
 */
def combineByKeyToReduceBy(sc: SparkContext): Unit = {
  val rdd = sc.parallelize(List(("a", 1), ("b", 2), ("c", 3), ("a", 2), ("a", 3), ("b", 3), ("c", 2)))
  // createCombiner keeps the first value; mergeValue and mergeCombiners
  // both add, which reproduces reduceByKey(_ + _)
  val reduced = rdd.combineByKey((v: Int) => v, (c: Int, v: Int) => c + v, (c1: Int, c2: Int) => c1 + c2)
  reduced.collect().foreach(println)
}
```

Original post · 2021-11-09 20:10:13 · 1671 reads · 0 comments
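As with the previous entry, the groupByKey variant named in the title is missing from the truncated snippet. combineByKey's three functions (createCombiner, mergeValue, mergeCombiners) can likewise be illustrated locally without Spark. This is a sketch of the semantics under the same single-"partition" assumption: `CombineByKeySemantics` is an illustrative name, and mergeCombiners is accepted but never invoked here.

```scala
object CombineByKeySemantics {
  // Local simulation of combineByKey(createCombiner, mergeValue, mergeCombiners):
  // the first value seen for a key becomes the combiner via createCombiner,
  // and later values are folded in with mergeValue. With one simulated
  // "partition" mergeCombiners is never needed.
  def combineByKey[K, V, C](data: Seq[(K, V)])
                           (createCombiner: V => C,
                            mergeValue: (C, V) => C,
                            mergeCombiners: (C, C) => C): Map[K, C] =
    data.groupBy(_._1).map { case (k, pairs) =>
      val vs = pairs.map(_._2)
      k -> vs.tail.foldLeft(createCombiner(vs.head))(mergeValue)
    }

  def main(args: Array[String]): Unit = {
    val data = List(("a", 1), ("b", 2), ("c", 3), ("a", 2), ("a", 3), ("b", 3), ("c", 2))
    // reduceByKey(_ + _): the combiner is a running sum
    val sums = combineByKey(data)((v: Int) => v,
      (c: Int, v: Int) => c + v, (c1: Int, c2: Int) => c1 + c2)
    assert(sums == Map("a" -> 6, "b" -> 5, "c" -> 5))
    // groupByKey: the combiner is the list of values seen so far
    val groups = combineByKey(data)((v: Int) => List(v),
      (c: List[Int], v: Int) => c :+ v, (c1: List[Int], c2: List[Int]) => c1 ++ c2)
    assert(groups("b") == List(2, 3))
    println(sums)
  }
}
```

The difference from aggregateByKey is that there is no shared zero value: createCombiner builds the initial accumulator from the first value itself, so the combiner type can differ per use (an Int sum for reduceByKey, a List for groupByKey).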