Spark
xhl101711
Cogroup with a single RDD parameter

```scala
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object Cogroup {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local", "Cogroup", new SparkConf())
    // ... (preview truncated in the original)
  }
}
```

Original · 2016-08-17 15:27:50 · 167 views · 0 comments
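Spark's `cogroup` groups the values of two keyed datasets by key, producing for each key a pair of value collections, one from each side. Since the preview above is truncated, here is a minimal pure-Python sketch of those semantics; this is not Spark code, and all names are illustrative:

```python
from collections import defaultdict

def cogroup(a, b):
    """Model of RDD.cogroup on two lists of (key, value) pairs:
    for each key present in either input, return
    (key, (values_from_a, values_from_b))."""
    grouped_a = defaultdict(list)
    grouped_b = defaultdict(list)
    for k, v in a:
        grouped_a[k].append(v)
    for k, v in b:
        grouped_b[k].append(v)
    # Keys missing on one side yield an empty list for that side.
    keys = sorted(set(grouped_a) | set(grouped_b))
    return [(k, (grouped_a[k], grouped_b[k])) for k in keys]

pairs1 = [("a", 1), ("b", 2), ("a", 3)]
pairs2 = [("a", 9), ("c", 4)]
print(cogroup(pairs1, pairs2))
# [('a', ([1, 3], [9])), ('b', ([2], [])), ('c', ([], [4]))]
```

Unlike a join, no key is dropped: keys that appear on only one side still show up, paired with an empty collection for the other side.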
spark kafka hbase

Continuing from the previous example, the results are written to HBase:

```scala
wordCounts.foreachRDD { rdd =>
  rdd.foreachPartition { partitionOfRecords =>
    // ConnectionPool is a static, lazily initialized pool of connections
    // ... (preview truncated in the original)
  }
}
```

Original · 2016-09-05 14:34:14 · 205 views · 0 comments
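The point of `foreachPartition` in the snippet above is that one HBase connection is acquired per partition rather than per record, which avoids paying connection-setup cost for every element. A stand-alone sketch of that pattern in plain Python (the `ConnectionPool` here is a toy stand-in I invented for illustration, not an HBase API):

```python
class ConnectionPool:
    """Toy stand-in for a static, lazily initialized connection pool."""
    opened = 0

    @classmethod
    def get_connection(cls):
        cls.opened += 1  # count how many connections were acquired
        return cls

    @classmethod
    def put(cls, record):
        pass  # a real connection would write the record to the datastore

def foreach_partition(partitions, handler):
    """Model of rdd.foreachPartition: call the handler once per partition."""
    for partition in partitions:
        handler(partition)

def save_partition(partition):
    conn = ConnectionPool.get_connection()  # one connection per partition
    for record in partition:
        conn.put(record)                    # reused for every record

partitions = [[("a", 1), ("b", 2)], [("c", 3)]]
foreach_partition(partitions, save_partition)
print(ConnectionPool.opened)  # 2 (one per partition, not one per record)
```

Had the connection been acquired inside the inner loop (the `foreach`-per-record shape), three connections would have been opened for the same data.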