- Kafka version: 0.11
- Flink version: 1.9
FlinkKafkaConsumer011 provides several methods for choosing where the consumer starts reading Kafka messages (a brief sketch of the non-timestamp options follows this list):
- setStartFromTimestamp
- setStartFromEarliest
- setStartFromLatest
- setStartFromSpecificOffsets
- setStartFromGroupOffsets
To consume Kafka messages starting from a point in time, call setStartFromTimestamp with an epoch timestamp in milliseconds:
```java
import java.util.Properties;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

public class StreamingJob {
    public static void main(String[] args) throws Exception {
        // set up the streaming execution environment
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        Properties prop = new Properties();
        prop.put("bootstrap.servers", "localhost:9092");
        prop.put("group.id", "flink-streaming-job");
        FlinkKafkaConsumer011<String> consumer = new FlinkKafkaConsumer011<>("start_log", new SimpleStringSchema(), prop);
        // start consuming Kafka data from the given epoch-millisecond timestamp
        consumer.setStartFromTimestamp(1571909309022L);
        // source
        DataStream<String> source = env.addSource(consumer);
        env.execute("flink-streaming-job");
    }
}
```
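Two behavioural notes, both documented for the Flink Kafka connector: the start-position setters only apply when the job starts without saved state, because on restore from a checkpoint or savepoint the offsets stored in that state take precedence. And with `setStartFromTimestamp`, any partition whose latest record is older than the given timestamp is simply read from its latest offset.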