Environment: CDH 5.15.1 with the Kafka 3.1.1-1.3.1.1 parcel. Problem: the consumer failed to pull any data.
Cause: only a single broker was configured.
Fix: configure multiple brokers.
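A plausible explanation for why a single broker breaks consumption (an assumption on my part, not stated in the original notes): on Kafka 1.0+, the internal __consumer_offsets topic is created with a default replication factor of 3, which cannot be satisfied by one broker, so consumers fail to join their group or fetch offsets. If adding brokers is not an option, a common single-broker workaround is to lower those replication settings in the broker's server.properties:

```properties
# server.properties (broker side) - single-broker workaround, illustrative values
offsets.topic.replication.factor=1
transaction.state.log.replication.factor=1
transaction.state.log.min.isr=1
```

The broker must be restarted for these settings to take effect, and they only help if __consumer_offsets has not already been created.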
Producer and consumer code (Kafka integrated with Spring Boot).
Maven dependencies:
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-streams</artifactId>
    <version>1.0.1</version>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>2.1.7.RELEASE</version>
</dependency>
application.properties
#=============== producer =======================
spring.kafka.producer.retries=0
# maximum size of each batch of messages, in bytes
spring.kafka.producer.batch-size=16384
spring.kafka.producer.buffer-memory=33554432
spring.kafka.producer.bootstrap-servers=192.168.25.128:9092,192.168.25.129:9092,192.168.25.130:9092
# serializers for the message key and value
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer

#=============== consumer =======================
#spring.kafka.consumer.bootstrap-servers=192.168.25.128:9092
spring.kafka.consumer.bootstrap-servers=192.168.25.128:9092,192.168.25.129:9092,192.168.25.130:9092
# default consumer group id
spring.kafka.consumer.group-id=test-consumer-group
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.enable-auto-commit=true
spring.kafka.consumer.auto-commit-interval=100
# deserializers for the message key and value
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
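For reference, the spring.kafka.consumer.* settings above correspond to the raw properties a plain KafkaConsumer would receive. A minimal pure-JDK sketch (the property keys are standard Kafka consumer config names; the mapping from the Spring keys is my reading of the config above):

```java
import java.util.Properties;

// Sketch: the raw java.util.Properties equivalent of the
// spring.kafka.consumer.* settings in application.properties.
public class ConsumerProps {
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers",
                "192.168.25.128:9092,192.168.25.129:9092,192.168.25.130:9092");
        props.put("group.id", "test-consumer-group");
        // a brand-new group starts reading from the beginning of the topic
        props.put("auto.offset.reset", "earliest");
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "100");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build());
    }
}
```

Note that auto-offset-reset=earliest only applies when the group has no committed offset yet; an existing group-id that has already committed offsets will not re-read old messages, which is a frequent cause of "consumer pulls nothing" reports.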
Message entity:
import java.util.Date;
import lombok.Data;

// Lombok's @Data already generates the getters and setters,
// so the manual accessors from the original listing are redundant.
@Data
public class Message {
    private Long id;        // message id
    private String msg;     // message body
    private Date sendTime;  // send timestamp
}
Producer:
// kafkaTemplate (KafkaTemplate<String, String>) and gson (Gson) are
// fields of the enclosing Spring component, injected/initialized elsewhere
public void send3() {
    Message message = new Message();
    message.setId(System.currentTimeMillis());
    message.setMsg(UUID.randomUUID().toString());
    message.setSendTime(new Date());
    System.out.println(gson.toJson(message));
    kafkaTemplate.send("I", gson.toJson(message));
}
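When debugging why the consumer sees nothing, it helps to know what actually lands on the topic. The JSON payload that send3() produces has roughly this shape; the sketch below is hand-rolled and illustrative only (the real code uses Gson, which in particular renders the Date field as a formatted date string, not an epoch value):

```java
import java.util.UUID;

// Illustrative only: approximates the JSON Gson emits for Message.
public class PayloadSketch {
    static String toJson(long id, String msg, String sendTime) {
        return String.format("{\"id\":%d,\"msg\":\"%s\",\"sendTime\":\"%s\"}",
                id, msg, sendTime);
    }

    public static void main(String[] args) {
        // hypothetical values mirroring what send3() generates
        String json = toJson(System.currentTimeMillis(),
                UUID.randomUUID().toString(),
                "Jan 1, 2019 12:00:00 AM");
        System.out.println(json);
    }
}
```

Comparing this against the console output of kafka-console-consumer on topic "I" quickly shows whether the problem is on the producing or the consuming side.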
Consumer:
@Component
@Slf4j
public class KafkaReceiver {

    @KafkaListener(topics = {"I"})
    public void listen(ConsumerRecord<?, ?> record) {
        Optional<?> kafkaMessage = Optional.ofNullable(record.value());
        if (kafkaMessage.isPresent()) {
            Object message = kafkaMessage.get();
            System.out.println("----------------- record = " + record);
            System.out.println("------------------ message = " + message);
        }
    }
}