SparkStreaming
throws-Exception
- Spark Streaming — integrating Spark Streaming with Kafka: reading and writing Kafka topics
Reading Kafka from Spark Streaming (stateless stream processing): object MyReadKafkaHandler { def main(args: Array[String]): Unit = { val conf = new SparkConf().setAppName("mytest").setMaster("local[2]"); val sc = SparkContext.getOrCreate(conf) // the streaming context class; val ssc = new Stre… (original post, 2020-08-28 11:02:49 · 282 views · 0 comments)
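The excerpt breaks off while constructing the StreamingContext. The stateless per-batch transformation such a job typically applies (a word count over each micro-batch) can be sketched in plain Scala collections, independent of Spark; the object name `BatchWordCount` and the sample lines are assumptions, not the post's code:

```scala
// Per-batch, stateless word count: the same flatMap/map/reduceByKey
// pipeline a DStream would run, modeled on a plain Scala collection.
// (Hypothetical helper; in Spark Streaming the input would be the record
// values of one micro-batch from KafkaUtils.createDirectStream.)
object BatchWordCount {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split("\\s+"))      // DStream.flatMap
      .filter(_.nonEmpty)
      .map(w => (w, 1))              // DStream.map
      .groupBy(_._1)                 // stands in for reduceByKey
      .map { case (w, pairs) => (w, pairs.map(_._2).sum) }

  def main(args: Array[String]): Unit =
    println(wordCount(Seq("hello kafka", "hello spark streaming")))
}
```

In the real job the same chain of calls is applied to the DStream returned by `KafkaUtils.createDirectStream`, and Spark executes it once per batch interval.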
- Integrating Spark Streaming with Kafka and Flume; how to use Spark Streaming windows
1. Integrating Spark Streaming with Flume — (1) push mode. Maven dependency:
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-flume_2.11</artifactId>
    <version>2.3.4</version>
  </dependency>
imp… (original post, 2020-08-20 20:17:25 · 202 views · 0 comments)
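The title also mentions Spark Streaming's window operations. Their semantics — aggregate over the last N batches, sliding forward one batch at a time — can be sketched in plain Scala without Spark; the batch contents below are assumed sample data:

```scala
// Sliding-window aggregation modeled in plain Scala: sum per-key counts
// over the last `windowLen` micro-batches, advancing one batch at a time.
// This mirrors what DStream.reduceByKeyAndWindow(_ + _, windowDuration,
// slideDuration) computes over a keyed stream.
object WindowSketch {
  def windowed(batches: Seq[Map[String, Int]],
               windowLen: Int): Seq[Map[String, Int]] =
    batches.sliding(windowLen, 1).map { window =>
      window.flatten                                  // all (key, count) pairs in the window
        .groupBy(_._1)
        .map { case (k, kvs) => (k, kvs.map(_._2).sum) }
    }.toSeq

  def main(args: Array[String]): Unit = {
    val batches = Seq(Map("a" -> 1), Map("a" -> 2, "b" -> 1), Map("b" -> 3))
    println(windowed(batches, 2))
  }
}
```

In Spark the window and slide durations must be multiples of the batch interval, which is exactly the "whole batches" granularity this sketch assumes.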
- Integrating Spark Streaming with Kafka — blacklist filtering
  import org.apache.kafka.clients.consumer.ConsumerConfig
  import org.apache.kafka.common.serialization.StringDeserializer
  import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}
  import org.apache.spark.streaming.{Secon… (original post, 2020-08-20 20:11:48 · 849 views · 0 comments)
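The imports show the spark-streaming-kafka-0-10 API, but the excerpt cuts off before the filtering logic. The core of blacklist filtering — drop events whose key appears in a blacklist — can be sketched on plain Scala pairs; in the Spark version this is usually done with `transform` and a `leftOuterJoin` against a blacklist RDD. The keys and events below are assumptions:

```scala
// Blacklist filtering modeled on plain Scala pairs: keep only events whose
// key is NOT in the blacklist. The Spark Streaming equivalent is the classic
//   stream.transform { rdd => rdd.leftOuterJoin(blacklistRdd)
//                                .filter { case (_, (_, hit)) => hit.isEmpty } }
// pattern; here a Set stands in for the blacklist RDD.
object BlacklistFilter {
  // events: (key, payload); blacklist: keys to drop
  def filterBlacklisted(events: Seq[(String, String)],
                        blacklist: Set[String]): Seq[(String, String)] =
    events.filterNot { case (k, _) => blacklist.contains(k) }

  def main(args: Array[String]): Unit = {
    val events = Seq(("userA", "click"), ("spamBot", "click"), ("userB", "view"))
    println(filterBlacklisted(events, Set("spamBot")))
  }
}
```

The join-based form is preferred in Spark when the blacklist is itself an RDD (possibly broadcast), since it avoids collecting it to the driver on every batch.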
- Spark Streaming — reading and writing Kafka from Spark Streaming
Reading Kafka from Spark Streaming (stateless stream processing): object MyReadKafkaHandler { def main(args: Array[String]): Unit = { val conf = new SparkConf().setAppName("mytest").setMaster("local[2]"); val sc = SparkContext.getOrCreate(conf) // the streaming context class; val ssc = new Stre… (original post, 2020-08-14 09:51:11 · 475 views · 0 comments)
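The title covers writing to Kafka as well, which the excerpt never reaches. Writing results back from `foreachRDD`/`foreachPartition` requires a KafkaProducer, and the producer and Spark calls need kafka-clients and Spark on the classpath, so only the pure-JVM config builder is shown runnable here; the broker address and object name are assumptions. The keys are the standard Kafka producer config names:

```scala
// Builds the java.util.Properties a KafkaProducer[String, String] needs.
// Only JVM stdlib is used here; the producer itself requires kafka-clients.
import java.util.Properties

object KafkaWriteConfig {
  def producerProps(brokers: String): Properties = {
    val props = new Properties()
    props.put("bootstrap.servers", brokers)
    props.put("key.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")
    props
  }

  // Usage sketch inside a streaming job (requires kafka-clients on the classpath):
  //   dstream.foreachRDD(_.foreachPartition { partition =>
  //     val producer =
  //       new KafkaProducer[String, String](producerProps("localhost:9092"))
  //     partition.foreach(msg => producer.send(new ProducerRecord(topic, msg)))
  //     producer.close()
  //   })
}
```

Creating the producer per partition (not per record, and not on the driver) is the usual pattern, because KafkaProducer is not serializable and connection setup is expensive.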