With the @KafkaListener annotation, a single annotation can consume multiple topics.
The topics attribute must be a constant at compile time, but we can resolve its value from configuration in either of the following two ways.
1: Read from the configuration file
# defined in the yml or properties file
topics: "admin,login,client"
@KafkaListener(topics = "#{'${topics}'.split(',')}",concurrency = "#{'${topics}'.split(',').length}")
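The SpEL expression above just splits the property string on commas and uses the resulting array (and its length) for the annotation attributes. A minimal plain-Java sketch of what that evaluation produces (standalone, not Spring code):

```java
public class TopicSplitDemo {
    public static void main(String[] args) {
        // the value as it would be read from the `topics` property
        String topics = "admin,login,client";
        String[] parsed = topics.split(",");
        // `topics` receives the array, `concurrency` its length
        System.out.println(String.join(" | ", parsed)); // admin | login | client
        System.out.println(parsed.length);              // 3
    }
}
```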
2: Read from a configuration class
// Create a bean that exposes the topic list
@Component
public class TopicHandler {

    private static final String[] KAFKA_TOPICS = {"admin", "login", "client"};

    // Call this method to obtain the topics; where they come from
    // (hard-coded, config, database, ...) is up to your implementation
    public String[] getTopics() {
        return KAFKA_TOPICS;
    }
}
@KafkaListener(topics = "#{topicHandler.getTopics()}",concurrency = "#{topicHandler.getTopics().length}")
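As the comment says, where getTopics() gets its values is up to you. One possible implementation, sketched here as standalone Java (the KAFKA_TOPICS environment variable name and the fallback list are hypothetical, not from the original post):

```java
public class TopicHandlerDemo {

    // One possible source: an environment variable, with a hard-coded fallback
    public String[] getTopics() {
        String raw = System.getenv().getOrDefault("KAFKA_TOPICS", "admin,login,client");
        return raw.split(",");
    }

    public static void main(String[] args) {
        TopicHandlerDemo handler = new TopicHandlerDemo();
        for (String topic : handler.getTopics()) {
            System.out.println(topic);
        }
    }
}
```

In the real Spring bean you would more typically inject the list with @Value, but any lookup that returns a non-empty String[] at startup works, since the SpEL expression is evaluated once when the listener container is created.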
Usage looks like this:
//@KafkaListener(topics = "#{topicHandler.getTopics()}", concurrency = "#{topicHandler.getTopics().length}")
@KafkaListener(topics = "#{'${topics}'.split(',')}", concurrency = "#{'${topics}'.split(',').length}")
public void logs(List<ConsumerRecord<?, ?>> records) {
    for (ConsumerRecord<?, ?> record : records) {
        System.out.println(record.topic());            // the topic the record came from
        System.out.println(record);                    // the full ConsumerRecord, i.e. all metadata plus the value
        System.out.println(record.value().toString()); // the actual payload we want
    }
}
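Note that receiving a List&lt;ConsumerRecord&lt;?, ?&gt;&gt; parameter requires batch listening to be enabled. With Spring Boot you can set spring.kafka.listener.type: batch in the configuration file, or define a container factory along these lines (a sketch assuming default Spring Kafka auto-configuration, not code from the original post):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;

@Configuration
public class KafkaBatchConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        // deliver records to the listener in batches,
        // so listener methods may take List<ConsumerRecord> parameters
        factory.setBatchListener(true);
        return factory;
    }
}
```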
The printed output looks like:
admin
ConsumerRecord(topic = admin, partition = 0, leaderEpoch = 2, offset = 136201, CreateTime = 1649400069032, serialized key size = -1, serialized value size = 1805, headers = RecordHeaders(headers = [], isReadOnly = false), key = null, value = {"account":"admin","name":"管理员","sip":"168.168.168.131","method":"POST","operation":"角色分页查询","code":"002143901","@timestamp":"2022-04-08T06:40:43Z","@mtime":1649400043559})
{"account":"admin","name":"管理员","sip":"168.168.168.131","method":"POST","operation":"角色分页查询","code":"002143901","@timestamp":"2022-04-08T06:40:43Z","@mtime":1649400043559}