SparkStreaming
wangfutai91
"Probe the depths, seek out the hidden; excellence makes the exceptional."
Ad Blacklist - SparkStreaming
Dynamic filtering: the blacklist changes constantly instead of being a fixed list, and is maintained with updateStateByKey. Blacklist entries are (name, flag) pairs generated dynamically, e.g. (aa, true), (bb, false): aa, true means confirmed blacklisted; bb, false means the name has entered the blacklist and awaits further confirmation. If a name already in the blacklist is passed in again from outside with false, that false is promoted to true; cc, false; if the outside source passes dd, true ... Original · 2019-03-05 19:01:18 · 239 reads · 0 comments
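The promotion rule described above boils down to a pure state-transition function for `updateStateByKey`. A minimal sketch, assuming the rule "a name already tracked that is reported again gets confirmed (flag forced to true); a first sighting keeps its incoming flag" — the helper name `updateFlag` is hypothetical, not from the post:

```scala
// Pure state-transition function for a (name, flag) blacklist kept by updateStateByKey.
// flag = true  => confirmed blacklisted
// flag = false => entered the blacklist, awaiting further confirmation
def updateFlag(reports: Seq[Boolean], state: Option[Boolean]): Option[Boolean] =
  state match {
    // Name already in the blacklist and reported again this batch:
    // promote false -> true (confirmed), regardless of the new flag.
    case Some(_) if reports.nonEmpty => Some(true)
    // No new report this batch: keep the existing flag.
    case Some(flag)                  => Some(flag)
    // First sighting: keep whatever flag the external source sent.
    case None                        => reports.lastOption
  }
```

Wired into a stream it would look like `pairs.updateStateByKey(updateFlag)`, with `ssc.checkpoint(...)` set, since stateful operations require checkpointing.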
sparkstreaming-transform
object Transform { def main(args: Array[String]): Unit = { System.setProperty("hadoop.home.dir", "E:\\software\\bigdate\\hadoop-2.6.0-cdh5.15.0\\hadoop-2.6.0-cdh5.15.0") val conf= new S... Original · 2019-03-05 19:03:05 · 351 reads · 0 comments
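The preview is cut off, but `transform` is typically used to apply RDD-to-RDD operations (such as joins against a blacklist) that DStreams do not expose directly. A minimal sketch under assumed details (socket source on port 9999, hard-coded blacklist names `aa`/`bb`):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object Transform {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("Transform")
    val ssc  = new StreamingContext(conf, Seconds(5))

    // Static blacklist RDD; in practice it could be rebuilt for each batch.
    val blacklist = ssc.sparkContext.parallelize(Seq(("aa", true), ("bb", true)))

    // transform exposes each micro-batch as a plain RDD, so RDD joins work.
    val lines   = ssc.socketTextStream("localhost", 9999)
    val cleaned = lines.map(word => (word, 1)).transform { rdd =>
      rdd.leftOuterJoin(blacklist)
         .filter { case (_, (_, flag)) => !flag.getOrElse(false) } // drop blacklisted keys
         .map { case (word, (count, _)) => (word, count) }
    }
    cleaned.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```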
sparkstreaming--Getting Started
//streaming word count object WordCountStreaming { def main(args: Array[String]): Unit = { System.setProperty("hadoop.home.dir", "E:\\software\\bigdate\\hadoop-2.6.0-cdh5.15.0\\hadoop-2.6.0-cdh5.15.0") ... Original · 2019-03-05 19:04:05 · 108 reads · 0 comments
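The canonical starting point the truncated preview implies is a socket word count. A complete sketch, assuming a socket source on port 9999 (fed by e.g. `nc -lk 9999`):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object WordCountStreaming {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("WordCountStreaming")
    val ssc  = new StreamingContext(conf, Seconds(5)) // 5-second micro-batches

    val lines  = ssc.socketTextStream("localhost", 9999)
    val counts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
    counts.print() // per-batch counts only; no state carried across batches

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Note that `local[2]` (or more) is needed locally: one thread runs the receiver, the rest process batches.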
sparkstreaming--updateStateByKey
//updateStateByKey: state persists across batches, i.e. it accumulates the results of all previous RDDs object WordCountUpdateStateByKey { def main(args: Array[String]): Unit = { System.setProperty("hadoop.home.dir", "E:\\software\\bigdate\\hadoop-2.6.0-cdh... Original · 2019-03-05 19:05:34 · 124 reads · 0 comments
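The accumulating behaviour described in the comment comes from the update function passed to `updateStateByKey`: per key, it receives this batch's new values plus the previous state and returns the new state. A minimal sketch (the helper name `updateCount` is mine, not from the post):

```scala
// For each key: sum this batch's values onto the previously accumulated count.
def updateCount(newValues: Seq[Int], state: Option[Int]): Option[Int] =
  Some(newValues.sum + state.getOrElse(0))
```

In the stream this would be `pairs.updateStateByKey(updateCount)`, and `ssc.checkpoint(...)` must be set first, because stateful DStreams require a checkpoint directory.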
sparkstreaming--window: merging the RDDs of multiple batches into one RDD
object WordCountWindows { def main(args: Array[String]): Unit = { System.setProperty("hadoop.home.dir", "E:\\software\\bigdate\\hadoop-2.6.0-cdh5.15.0\\hadoop-2.6.0-cdh5.15.0") val conf= ... Original · 2019-03-05 19:06:41 · 2101 reads · 0 comments
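A windowed word count, the usual shape of the example the title describes. A sketch with assumed durations (30-second window sliding every 10 seconds over 5-second batches; both window and slide must be multiples of the batch interval):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object WordCountWindows {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("WordCountWindows")
    val ssc  = new StreamingContext(conf, Seconds(5))

    val pairs = ssc.socketTextStream("localhost", 9999)
                   .flatMap(_.split(" ")).map((_, 1))

    // Each computation sees the RDDs of the last 30s merged into one logical RDD,
    // recomputed every 10s.
    val windowed = pairs.reduceByKeyAndWindow(
      (a: Int, b: Int) => a + b, Seconds(30), Seconds(10))
    windowed.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```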
SparkStreaming--Input Sources (local files)
//input source object WordCountHDFSSource { def main(args: Array[String]): Unit = { System.setProperty("hadoop.home.dir", "E:\\software\\bigdate\\hadoop-2.6.0-cdh5.15.0\\hadoop-2.6.0-cdh5.15.0") val... Original · 2019-03-05 19:08:32 · 1061 reads · 0 comments
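A file-based source uses `textFileStream`, which monitors a directory and processes files that appear in it after the stream starts (files must be moved or renamed in atomically; rewriting a file in place is not picked up). A sketch with a hypothetical directory path:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object WordCountHDFSSource {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("WordCountHDFSSource")
    val ssc  = new StreamingContext(conf, Seconds(5))

    // Works with local paths or HDFS paths; only NEW files in the directory
    // (created after start) are read, each exactly once.
    val lines  = ssc.textFileStream("file:///tmp/streaming-in")
    val counts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
    counts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```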
SparkStreaming--Input Sources (Kafka)
object WCKafka extends App{ System.setProperty("hadoop.home.dir", "E:\\software\\bigdate\\hadoop-2.6.0-cdh5.15.0\\hadoop-2.6.0-cdh5.15.0") val conf= new SparkConf(); conf.setMaster("local[... Original · 2019-03-05 19:09:23 · 263 reads · 0 comments
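The preview cuts off before the Kafka wiring. A sketch using the `spark-streaming-kafka-0-10` direct-stream API — note this is an assumption: a CDH 5.15-era post may well use the older 0.8 `createDirectStream` signature instead; broker address, group id, and topic name here are placeholders:

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object WCKafka {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("WCKafka")
    val ssc  = new StreamingContext(conf, Seconds(5))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "localhost:9092",
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "wc-demo",
      "auto.offset.reset"  -> "latest"
    )

    // Direct stream: no receiver; each batch reads its offset range from Kafka.
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("wordcount"), kafkaParams))

    stream.map(_.value)
          .flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
          .print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```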
SparkStreaming--Output (writing text * saveAs && foreachRDD)
object Output_01 { def main(args: Array[String]): Unit = { System.setProperty("hadoop.home.dir", "E:\\software\\bigdate\\hadoop-2.6.0-cdh5.15.0\\hadoop-2.6.0-cdh5.15.0") val conf= new Sp... Original · 2019-03-05 19:10:36 · 396 reads · 0 comments
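The two output styles the title names can be sketched as follows (output path and port are placeholders): `saveAsTextFiles` writes one directory per batch, while `foreachRDD` is the generic escape hatch for custom sinks.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object Output_01 {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("Output_01")
    val ssc  = new StreamingContext(conf, Seconds(5))

    val counts = ssc.socketTextStream("localhost", 9999)
                    .flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)

    // Built-in output: one directory per batch, named <prefix>-<batchTime>.<suffix>.
    counts.saveAsTextFiles("file:///tmp/wc/out", "txt")

    // Custom output: full control over each batch's RDD. Open expensive
    // resources (DB connections, etc.) once per partition, not per record.
    counts.foreachRDD { rdd =>
      rdd.foreachPartition { iter =>
        iter.foreach(pair => println(pair))
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```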