Background
Kafka retains messages on the broker, so in practice you often need to re-consume data that has already been read. The consumer API is flexible enough to support this: you can point a consumer at a specific partition and offset and replay from there.
Code
```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

Properties props = new Properties();
props.put("bootstrap.servers", ":9096");
props.put("group.id", "test2121");
props.put("enable.auto.commit", "true");
props.put("auto.commit.interval.ms", "1000");
props.put("session.timeout.ms", "30000");
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
KafkaConsumer<String, String> kafkaConsumer = new KafkaConsumer<>(props);

String topic = "com.auditlog.all";
// Build the TopicPartition handle for partition 3 of the topic
TopicPartition topicPartition = new TopicPartition(topic, 3);
// Assign this consumer to the specific topic and partition (important!).
// assign() throws IllegalArgumentException if the topic or partition does not exist,
// and IllegalStateException if this consumer previously subscribed via subscribe()
// and has not unsubscribed.
kafkaConsumer.assign(Collections.singletonList(topicPartition));
// Override the consumer's position for this topic-partition so consumption
// restarts from the given offset (with enable.auto.commit=true, the new
// position will also be committed as the consumer polls)
kafkaConsumer.seek(topicPartition, 2025000585L);
while (true) {
    ConsumerRecords<String, String> records = kafkaConsumer.poll(Duration.ofMillis(100));
    for (ConsumerRecord<String, String> record : records) {
        // Process the replayed record, e.g. log it
        System.out.printf("partition=%d offset=%d value=%s%n",
                record.partition(), record.offset(), record.value());
    }
}
```
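To avoid the exceptions mentioned above, it can help to validate the target offset before calling `seek()`: `beginningOffsets()` and `endOffsets()` return the range the broker actually retains. Below is a minimal sketch; `OffsetGuard` and `clampOffset` are hypothetical names, not part of the Kafka API:

```java
public class OffsetGuard {

    // Pure helper (hypothetical): clamp a requested offset into
    // [earliest, latest] so that seek() never points outside the
    // range the broker actually retains.
    public static long clampOffset(long requested, long earliest, long latest) {
        return Math.max(earliest, Math.min(requested, latest));
    }

    // Usage against a live consumer (assumes the consumer from the code above):
    // TopicPartition tp = new TopicPartition("com.auditlog.all", 3);
    // long earliest = kafkaConsumer.beginningOffsets(Collections.singletonList(tp)).get(tp);
    // long latest   = kafkaConsumer.endOffsets(Collections.singletonList(tp)).get(tp);
    // kafkaConsumer.seek(tp, OffsetGuard.clampOffset(2025000585L, earliest, latest));
}
```

Seeking below the earliest retained offset is not fatal (the consumer resets according to `auto.offset.reset`), but clamping makes the replay start point explicit.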
Summary
Replaying data in Kafka is very flexible: besides seeking to a specific offset, you can also start consuming from a specified point in time. For details, see the article "Kafka 根据指定时间消费数据" (consuming Kafka data from a specified time).
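Timestamp-based replay builds on the same `assign()`/`seek()` pattern: `offsetsForTimes()` translates a timestamp into the first offset with an equal or later timestamp, which you then seek to. A minimal sketch, reusing the topic and consumer from the code above (the one-hour lookback is an arbitrary example):

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Map;

import org.apache.kafka.clients.consumer.OffsetAndTimestamp;
import org.apache.kafka.common.TopicPartition;

// Rewind partition 3 to the first message at or after a given timestamp
TopicPartition tp = new TopicPartition("com.auditlog.all", 3);
long startTimestamp = System.currentTimeMillis() - Duration.ofHours(1).toMillis();

Map<TopicPartition, OffsetAndTimestamp> result =
        kafkaConsumer.offsetsForTimes(Collections.singletonMap(tp, startTimestamp));
OffsetAndTimestamp oat = result.get(tp);
if (oat != null) {
    // null means no message in the partition is at or after the timestamp
    kafkaConsumer.seek(tp, oat.offset());
}
```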