Spark Learning Column
By: Big Data Development Learner
Spark Operators: aggregateByKey Usage and Flow Analysis
Excerpt: package com.bigdata.spark.core.rdd.oper.transform · import org.apache.spark.{SparkConf, SparkContext} · object RDD_Oper_Transform_1 { def main(args: Array[String]): Unit = { val conf = new SparkConf().setMaster("lo…
Original · 2022-05-07 10:47:42 · 679 views · 0 comments
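The excerpt above is cut off mid-line, so the full article code is not recoverable here. As a hedged illustration of what an aggregateByKey walkthrough typically covers, the following minimal sketch (object name, sample data, and the max-then-sum functions are my assumptions, not the original article's code) shows the two-step flow: a zero value and seqOp applied within each partition, then combOp merging partial results across partitions.

```scala
package com.bigdata.spark.core.rdd.oper.transform

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical demo object; the original article's code is truncated in the excerpt.
object AggregateByKeyDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[*]").setAppName("aggregateByKey-demo")
    val sc = new SparkContext(conf)

    // Two partitions: ("a",1),("a",2) land in partition 0; ("b",3),("b",4) in partition 1.
    val rdd = sc.makeRDD(List(("a", 1), ("a", 2), ("b", 3), ("b", 4)), 2)

    // zeroValue = 0
    // seqOp  (within a partition):  take the max of the accumulator and each value
    // combOp (across partitions):   sum the per-partition maxima
    val result = rdd.aggregateByKey(0)(math.max, _ + _)

    // With the partitioning above this yields (a,2) and (b,4);
    // note the result depends on how keys are split across partitions.
    result.collect().foreach(println)

    sc.stop()
  }
}
```

Note the design point the operator exposes: unlike reduceByKey, the intra-partition function (seqOp) and inter-partition function (combOp) can differ, and the zero value lets the accumulator type differ from the value type.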
SparkCore Operators: combineByKey Usage
Excerpt: package com.bigdata.spark.core.rdd.oper.transform · import org.apache.spark.{SparkConf, SparkContext} · object RDD_Oper_Transform { def main(args: Array[String]): Unit = { val conf = new SparkConf().setMaster("local[*]").set…
Original · 2022-05-07 10:31:24 · 675 views · 0 comments
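This excerpt is likewise truncated, so the article's actual code cannot be restored. As a hedged sketch of the classic combineByKey use case (per-key average; the object name and sample scores are illustrative assumptions), the three functions below show the operator's structure: createCombiner for the first value of a key in a partition, mergeValue for subsequent values in that partition, and mergeCombiners across partitions.

```scala
package com.bigdata.spark.core.rdd.oper.transform

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical demo object; the original article's code is truncated in the excerpt.
object CombineByKeyDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[*]").setAppName("combineByKey-demo")
    val sc = new SparkContext(conf)

    val rdd = sc.makeRDD(
      List(("a", 88), ("b", 95), ("a", 91), ("b", 93), ("a", 95), ("b", 98)), 2)

    // Accumulate (sum, count) per key, then divide to get the average.
    val sumCount = rdd.combineByKey(
      (v: Int) => (v, 1),                                          // createCombiner
      (acc: (Int, Int), v: Int) => (acc._1 + v, acc._2 + 1),       // mergeValue
      (a: (Int, Int), b: (Int, Int)) => (a._1 + b._1, a._2 + b._2) // mergeCombiners
    )

    val avg = sumCount.mapValues { case (sum, cnt) => sum.toDouble / cnt }
    // "a" averages (88+91+95)/3; "b" averages (95+93+98)/3 — independent of partitioning.
    avg.collect().foreach(println)

    sc.stop()
  }
}
```

combineByKey is the most general of the shuffle-with-aggregation operators; reduceByKey and aggregateByKey are both implemented in terms of it, differing only in how the first value of each key is turned into a combiner.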