Flink: reading data from Kafka and batch-writing to MySQL
Requirement
Consume data from Kafka, process it, and write the results to MySQL.
Reading data from Kafka
// Kafka consumer configuration
Properties prop = new Properties();
prop.setProperty("bootstrap.servers", "localhost:9092");
prop.setProperty("group.id", "test_group");
prop.setProperty("auto.offset.reset", "latest");
// When checkpointing is enabled, Flink commits offsets on checkpoints and this setting is ignored
prop.setProperty("enable.auto.commit", "true");
// Note: ZooKeeper settings are not needed here; the 0.10 consumer talks to the brokers directly

// Add the Kafka source
DataStream<String> dataStreamSource = env.addSource(
        new FlinkKafkaConsumer010<>("topic_name", new SimpleStringSchema(), prop));
Collecting data with a tumbling window
The sink calls invoke() once per record it receives. To write in batches, collect the records into a list before they reach the sink.
dataStreamSource
        .timeWindowAll(Time.seconds(60))
        .apply(new AllWindowFunction<String, List<String>, TimeWindow>() {
            @Override
            public void apply(TimeWindow window, Iterable<String> values, Collector<List<String>> out) throws Exception {
                // Gather everything in the 60-second window into one list
                List<String> list = Lists.newArrayList(values);
                if (!list.isEmpty()) {
                    out.collect(list);
                }
            }
        });
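The collect-then-emit idea behind the window step can be illustrated outside Flink as well: records are grouped into batches and each batch is handed downstream as a single unit. The sketch below uses a fixed batch size instead of a time window (the class name and batch size are hypothetical, for illustration only):

```java
import java.util.ArrayList;
import java.util.List;

// Illustration of batching before a write: instead of handling records one
// by one, group them so each group can be written in a single operation.
class BatchCollector {
    // Split records into consecutive batches of at most batchSize elements.
    static List<List<String>> toBatches(List<String> records, int batchSize) {
        List<List<String>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += batchSize) {
            batches.add(new ArrayList<>(records.subList(i, Math.min(i + batchSize, records.size()))));
        }
        return batches;
    }
}
```

In the Flink job above, the tumbling window plays this role: the window boundary, rather than a count, decides when a batch is closed and emitted.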
Batch-writing to MySQL
dataStreamSource.timeWindowAll(Time.seconds
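The snippet above is cut off, but the usual pattern is to chain the windowed List into a sink that performs one JDBC round trip per batch via addBatch()/executeBatch(). A minimal sketch of that write step in plain JDBC follows; the table t_log, column msg, and class name are assumptions for illustration, not the author's schema:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.List;

// Sketch of the batch write step (table "t_log" and column "msg" are assumed).
class MysqlBatchWriter {
    static final String INSERT_SQL = "INSERT INTO t_log (msg) VALUES (?)";

    // One prepared statement and one executeBatch() per window of records,
    // instead of one round trip to MySQL per record.
    static int[] writeBatch(Connection conn, List<String> batch) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(INSERT_SQL)) {
            for (String msg : batch) {
                ps.setString(1, msg);
                ps.addBatch();
            }
            return ps.executeBatch();
        }
    }
}
```

In a Flink job this logic would typically live inside a RichSinkFunction&lt;List&lt;String&gt;&gt;: open the Connection in open(), call a method like writeBatch() from invoke(), and close the Connection in close().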