Getting Started with Flink (12): Custom eventTime

Kafka messages carry their own timestamp, but sometimes you need a custom eventTime extracted from the message body. Straight to the code:

        final StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment();
        env.setParallelism(2);
        // use event time
        env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
        // consume from Kafka
        Properties propsConsumer = new Properties();
        propsConsumer.setProperty("bootstrap.servers", KafkaConfig.KAFKA_BROKER_LIST);
        propsConsumer.setProperty("group.id", "trafficwisdom-streaming");
        propsConsumer.setProperty("enable.auto.commit", "false");
        propsConsumer.setProperty("max.poll.records", "1000");
        FlinkKafkaConsumer011<String> consumer = new FlinkKafkaConsumer011<String>("topic-test", new SimpleStringSchema(), propsConsumer);
        consumer.setStartFromLatest();
        DataStream<String> stream = env.addSource(consumer);
        stream.print();
        // custom extraction: use the dateTime field of the message as eventTime
        DataStream<String> streamResult = stream.assignTimestampsAndWatermarks(new AscendingTimestampExtractor<String>() {
            @Override
            public long extractAscendingTimestamp(String element) {
                JSONObject jsonObject = JSON.parseObject(element);
                Long dateTime = jsonObject.getLong("dateTime");
                return dateTime;
            }
        });

        // windowing test: count per key over a 1-hour window sliding every 30 s
        DataStream<Tuple2<String, Integer>> finalResult = streamResult.map(new MapFunction<String, Tuple2<String, Integer>>() {
            @Override
            public Tuple2<String, Integer> map(String value) throws Exception {
                JSONObject jsonObject = JSON.parseObject(value);
                String key = jsonObject.getString("key");
                return Tuple2.of(key, 1);
            }
        });

        DataStream<Tuple2<String, Integer>> aggResult = finalResult.keyBy(0)
                .window(SlidingEventTimeWindows.of(Time.seconds(60 * 60), Time.seconds(30)))
                .allowedLateness(Time.seconds(10))
                .sum(1);
        aggResult.print();
        env.execute("WaterMarkDemo");
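One thing to keep in mind with AscendingTimestampExtractor: the watermark it emits trails the largest timestamp seen so far by 1 ms. A minimal plain-Java sketch of that behavior (illustrative only, not the Flink class itself):

```java
// Sketch of how AscendingTimestampExtractor derives watermarks:
// the watermark trails the largest extracted timestamp by 1 ms.
public class AscendingWatermarkSketch {
    private long currentTimestamp = Long.MIN_VALUE;

    // Called per element with the extracted "dateTime" field.
    public long extractTimestamp(long elementTimestamp) {
        currentTimestamp = Math.max(currentTimestamp, elementTimestamp);
        return elementTimestamp;
    }

    // Periodically queried by Flink; watermark = max timestamp - 1.
    public long getCurrentWatermark() {
        return currentTimestamp == Long.MIN_VALUE ? Long.MIN_VALUE : currentTimestamp - 1;
    }

    public static void main(String[] args) {
        AscendingWatermarkSketch e = new AscendingWatermarkSketch();
        e.extractTimestamp(1558925900000L); // 10:58:20
        e.extractTimestamp(1558925920000L); // 10:58:40
        System.out.println(e.getCurrentWatermark()); // 1558925919999
    }
}
```

This "max timestamp minus 1" rule is what drives all the window firings observed below.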

Now for the test. Send two messages:

{"dateTime":1558925900000,"key":"test1"}    //2019-05-27 10:58:20
{"dateTime":1558925920000,"key":"test1"}    //2019-05-27 10:58:40

Result:

1> {"dateTime":1558925900000,"key":"test1"} 
1> {"dateTime":1558925920000,"key":"test1"}
1> (test1,1)

Immediately afterwards, send another message:

{"dateTime":1558925960000,"key":"test5"}   //2019-05-27 10:59:20

which produces:

1> {"dateTime":1558925960000,"key":"test5"}
1> (test1,2)

Why wasn't the first count (test1,2)?
Looking at org.apache.flink.streaming.api.windowing.triggers.EventTimeTrigger, the call
ctx.registerEventTimeTimer(window.maxTimestamp()) registers the firing timer at end - 1. The earliest window containing the first message ends at 1558925910000 (10:58:30); the watermark produced by the second message (1558925920000 - 1) is past that window's maxTimestamp, so it fires — but the second message (10:58:40) falls outside that window and is not counted.
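The boundary arithmetic can be checked by hand. The sketch below mirrors (but does not call) Flink's sliding-window math; the helper names are illustrative, not Flink API:

```java
// Sketch: reproducing the sliding-window math by hand to see why the
// first firing only counted one element. SIZE/SLIDE match the demo above.
public class WindowBoundarySketch {
    static final long SIZE = 60 * 60 * 1000L; // 1 hour
    static final long SLIDE = 30 * 1000L;     // 30 seconds

    // Start of the most recent window containing ts (offset 0).
    static long lastWindowStart(long ts) {
        return ts - (ts % SLIDE);
    }

    // End of the earliest window containing ts; its maxTimestamp is end - 1.
    static long earliestWindowEnd(long ts) {
        return lastWindowStart(ts) + SLIDE;
    }

    public static void main(String[] args) {
        long first = 1558925900000L;  // 10:58:20
        long second = 1558925920000L; // 10:58:40
        long end = earliestWindowEnd(first);      // 1558925910000 = 10:58:30
        long watermark = second - 1;              // ascending extractor: max ts - 1
        // The trigger fires because watermark >= maxTimestamp (end - 1) ...
        System.out.println(watermark >= end - 1); // true
        // ... but the second element lies outside that window:
        System.out.println(second < end);         // false -> not counted
    }
}
```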


Restart the demo and send two messages:

{"dateTime":1558925900000,"key":"test1"}  //2019-05-27 10:58:20
{"dateTime":1558925960000,"key":"test1"}  //2019-05-27 10:59:20

Result:

1> {"dateTime":1558925900000,"key":"test1"}
1> {"dateTime":1558925960000,"key":"test1"}
1> (test1,1)
1> (test1,1)

Two results appear because the window slides every 30 seconds: the second message advances the watermark by 60 s, which fires two windows (ending 10:58:30 and 10:59:00), and each of them contains only the first message.
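The count of firings can be sketched like this (illustrative code reproducing the trigger condition, not Flink API):

```java
// Sketch: counting how many sliding windows fire when the watermark
// advances. A window [end - SIZE, end) fires once the watermark reaches
// end - 1; window ends are spaced SLIDE apart.
public class FiredWindowCounter {
    static final long SIZE = 60 * 60 * 1000L; // 1 hour
    static final long SLIDE = 30 * 1000L;     // 30 seconds

    // Number of windows containing `ts` whose maxTimestamp (end - 1)
    // falls in (oldWatermark, newWatermark].
    static int firedWindows(long ts, long oldWatermark, long newWatermark) {
        long firstEnd = ts - (ts % SLIDE) + SLIDE; // earliest window end containing ts
        long lastEnd = ts - (ts % SLIDE) + SIZE;   // latest window end containing ts
        int fired = 0;
        for (long end = firstEnd; end <= lastEnd; end += SLIDE) {
            if (end - 1 > oldWatermark && end - 1 <= newWatermark) {
                fired++;
            }
        }
        return fired;
    }

    public static void main(String[] args) {
        long first = 1558925900000L;  // 10:58:20
        long second = 1558925960000L; // 10:59:20
        // Before the second message no window had fired; the second message
        // advances the watermark from first - 1 to second - 1.
        System.out.println(firedWindows(first, first - 1, second - 1)); // 2
    }
}
```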


Restart the demo again and send:


{"dateTime":1558925900000,"key":"test1"}  //2019-05-27 10:58:20
{"dateTime":1558926140000,"key":"test1"}  //2019-05-27 11:02:20

Result:

1> {"dateTime":1558925900000,"key":"test1"}
1> {"dateTime":1558926140000,"key":"test1"} 
1> (test1,1)
1> (test1,1)
1> (test1,1)
1> (test1,1)
1> (test1,1)
1> (test1,1)
1> (test1,1)
1> (test1,1)

The watermark jumps 4 minutes (240 s); at one firing per 30 s slide, that is exactly 8 results.
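The same arithmetic in a few lines (illustrative; it holds here because window ends are spaced exactly one slide apart and every fired window contains the first message):

```java
// Sketch: the watermark jump is 11:02:20 - 10:58:20 = 240 s; with one
// window end every 30 s, that uncovers 240 / 30 = 8 window firings.
public class SlideCountSketch {
    public static void main(String[] args) {
        long first = 1558925900000L;   // 10:58:20
        long second = 1558926140000L;  // 11:02:20
        long slide = 30 * 1000L;       // 30 seconds
        long fired = (second - first) / slide;
        System.out.println(fired); // 8
    }
}
```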
