Spark Streaming Polling Data from Flume

1. Official documentation

http://spark.apache.org/docs/latest/streaming-flume-integration.html

2. Download the required dependencies into Flume's lib directory
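Per the Spark Streaming + Flume integration guide, the pull-based approach needs the custom Spark sink jar and its dependencies on Flume's classpath. A sketch of the jars to place in flume/lib (the version placeholders are assumptions; match them to your Spark and Scala versions):

```
spark-streaming-flume-sink_2.11-<spark-version>.jar
scala-library-<scala-version>.jar
commons-lang3-<version>.jar
```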

3. Configure the Flume configuration file

# agent1 is the agent name

agent1.sources=source1

agent1.sinks=sink1

agent1.channels=channel1

 

 

# Configure source1

agent1.sources.source1.type=spooldir

agent1.sources.source1.spoolDir=/usr/local/flume/tmp/TestDir

agent1.sources.source1.channels=channel1

agent1.sources.source1.fileHeader = false

agent1.sources.source1.interceptors = i1

agent1.sources.source1.interceptors.i1.type = timestamp

 

# Configure sink1

#agent1.sinks.sink1.type=hdfs

#agent1.sinks.sink1.hdfs.path=hdfs://master:9000/library/flume

#agent1.sinks.sink1.hdfs.fileType=DataStream

#agent1.sinks.sink1.hdfs.writeFormat=TEXT

#agent1.sinks.sink1.hdfs.rollInterval=1

#agent1.sinks.sink1.channel=channel1

#agent1.sinks.sink1.hdfs.filePrefix=%Y-%m-%d

# (Push approach) Spark Streaming captures Flume's data: Flume pushes to the host/port Spark Streaming listens on

#agent1.sinks.sink1.type=avro

#agent1.sinks.sink1.channel=channel1

#agent1.sinks.sink1.hostname=Master

#agent1.sinks.sink1.port=9999

 

# (Pull approach) Spark Streaming actively pulls data from Flume, on demand

agent1.sinks.sink1.type=org.apache.spark.streaming.flume.sink.SparkSink

agent1.sinks.sink1.hostname=master

agent1.sinks.sink1.port=9999

agent1.sinks.sink1.channel=channel1

 

# Configure channel1

agent1.channels.channel1.type=file

agent1.channels.channel1.checkpointDir=/usr/local/flume/tmp/checkpointDir

agent1.channels.channel1.dataDirs=/usr/local/flume/tmp/dataDirs

 
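Once the configuration above is saved (assumed here to be saved as flume-poll.conf, a hypothetical filename), the agent can be started with the standard flume-ng command; files dropped into the spoolDir (/usr/local/flume/tmp/TestDir) will then be ingested:

```
flume-ng agent \
  --conf $FLUME_HOME/conf \
  --conf-file flume-poll.conf \
  --name agent1 \
  -Dflume.root.logger=INFO,console
```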

4. The Spark Streaming code

 

```java
// Requires spark-streaming-flume_2.xx on the classpath
SparkConf conf = new SparkConf().setMaster("local[4]").setAppName("FlumePollDate2SparkStreaming");
JavaStreamingContext jsc = new JavaStreamingContext(conf, Durations.seconds(30));

// Poll the SparkSink configured above on master:9999
JavaReceiverInputDStream<SparkFlumeEvent> lines = FlumeUtils.createPollingStream(jsc, "master", 9999);

lines.print(); // at least one output operation is required before start()
jsc.start();
jsc.awaitTermination();
```

Below is a code implementation of receiving Flume data with Spark Streaming (note: `FlumeUtils.createStream` is the push-based approach, in contrast to the polling approach above):

1. First, add the following dependency to pom.xml:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-flume_2.11</artifactId>
    <version>2.4.4</version>
</dependency>
```

2. Next, in the Spark Streaming application, create a Flume event stream bound to the given host and port:

```scala
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume._

val ssc = new StreamingContext(sparkConf, Seconds(1))
val flumeStream = FlumeUtils.createStream(ssc, hostname, port)
```

Here, hostname and port are the address the receiver listens on; Flume's avro sink must be configured to push events to this address.

3. Process the received data with DStream transformations:

```scala
val events = flumeStream.map(event => new String(event.event.getBody.array()))
val words = events.flatMap(_.split(" "))
val wordCounts = words.map(word => (word, 1)).reduceByKey(_ + _)
```

4. Finally, output the result with DStream's print method:

```scala
wordCounts.print()
```

Complete code example:

```scala
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume._

val ssc = new StreamingContext(sparkConf, Seconds(1))
val flumeStream = FlumeUtils.createStream(ssc, hostname, port)
val events = flumeStream.map(event => new String(event.event.getBody.array()))
val words = events.flatMap(_.split(" "))
val wordCounts = words.map(word => (word, 1)).reduceByKey(_ + _)
wordCounts.print()
ssc.start()
ssc.awaitTermination()
```

Note: in a real application, choose a batch interval and Flume agent configuration appropriate to your environment.
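The word-count pipeline above (flatMap to words, map to pairs, reduceByKey to sum) can be illustrated with plain Java collections, independent of Spark; this is a minimal sketch of the same logic on a single fixed batch (class and method names are illustrative, not part of any Spark API):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCountSketch {
    // Mirrors flatMap(_.split(" ")).map(w => (w, 1)).reduceByKey(_ + _) for one batch
    static Map<String, Integer> wordCounts(List<String> events) {
        return events.stream()
                .flatMap(line -> Arrays.stream(line.split(" ")))          // lines -> words
                .collect(Collectors.toMap(w -> w, w -> 1, Integer::sum)); // (word, 1) merged by sum
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = wordCounts(List.of("spark streaming flume", "spark flume"));
        System.out.println(counts.get("spark"));     // 2
        System.out.println(counts.get("streaming")); // 1
        System.out.println(counts.get("flume"));     // 2
    }
}
```

In Spark, the same reduction runs per micro-batch across partitions; the merge function passed to `toMap` plays the role of the `reduceByKey` combiner.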