Queue Messaging with Spring Boot and Kafka
Software requirements
- JDK1.8
- Zookeeper-3.5.6
- Kafka_2.12-2.3.1
- Spring Boot 2.2.1.RELEASE
- IDEA 2019.2.4
Install the software listed above before continuing.
Start Zookeeper
Edit the configuration file
In the conf folder under apache-zookeeper-3.5.6-bin, make a copy of zoo_sample.cfg and rename the copy to zoo.cfg.
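For a single local Zookeeper node, the defaults shipped in zoo_sample.cfg are usually sufficient; the key entries look roughly like this (values taken from the sample file, so verify them against your copy):

```properties
# basic time unit in milliseconds used by Zookeeper
tickTime=2000
# directory where snapshots are stored (change from /tmp for real use)
dataDir=/tmp/zookeeper
# the port clients (including Kafka) connect to
clientPort=2181
```

Note that clientPort=2181 is the port Kafka's default server.properties expects.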
Launch PowerShell
In the apache-zookeeper-3.5.6-bin folder, hold SHIFT and right-click to open a PowerShell or cmd window, then run:
.\bin\zkServer.cmd
Start Kafka
Edit the configuration file (for a single local broker, the defaults in config\server.properties can usually be left as-is)
Start the service
In the kafka_2.12-2.3.1 folder, hold SHIFT and right-click to open a PowerShell or cmd window, then run:
.\bin\windows\kafka-server-start.bat config\server.properties
Create a topic
Open a new PowerShell window in the kafka_2.12-2.3.1 folder and run:
.\bin\windows\kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic springboot-kafka
This creates a topic named springboot-kafka. Next we build a simple queue demo.
Create the Spring Boot project
Add Kafka via Maven
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.2.1.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.pengo</groupId>
    <artifactId>springboot-kafka</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>springboot-kafka</name>
    <description>A demo project for Spring Boot + Kafka</description>
    <properties>
        <java.version>1.8</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>
        <!-- Kafka -->
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>
        <!-- Lombok -->
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
            <exclusions>
                <exclusion>
                    <groupId>org.junit.vintage</groupId>
                    <artifactId>junit-vintage-engine</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
Configure application.properties
## kafka ##
spring.kafka.bootstrap-servers=127.0.0.1:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.consumer.group-id=test
spring.kafka.consumer.enable-auto-commit=true
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
# custom property: the topic name
spring.kafka.topic=springboot-kafka
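Spring Boot's auto-configuration turns the spring.kafka.* entries above into plain Kafka client configuration. As a JDK-only illustration (the keys shown are the standard Kafka client config names; the real mapping is performed by Spring Boot, not by this code), the resulting producer and consumer settings look roughly like:

```java
import java.util.Properties;

public class KafkaConfigSketch {
    // Producer settings derived from the spring.kafka.producer.* entries.
    static Properties producerProps() {
        Properties p = new Properties();
        p.put("bootstrap.servers", "127.0.0.1:9092");
        p.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        p.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        return p;
    }

    // Consumer settings derived from the spring.kafka.consumer.* entries.
    static Properties consumerProps() {
        Properties p = new Properties();
        p.put("bootstrap.servers", "127.0.0.1:9092");
        p.put("group.id", "test");
        p.put("enable.auto.commit", "true");
        p.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        p.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        return p;
    }

    public static void main(String[] args) {
        System.out.println(producerProps().getProperty("bootstrap.servers"));
        System.out.println(consumerProps().getProperty("group.id"));
    }
}
```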
Producer
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Slf4j
@Component
public class Producer {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    // Injected from application.properties
    @Value("${spring.kafka.topic}")
    private String topic;

    public void send(String jsonString) {
        log.info("Message to send: {}", jsonString);
        kafkaTemplate.send(topic, jsonString);
    }
}
Consumer
import lombok.extern.slf4j.Slf4j;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Slf4j
@Component
public class Consumer {

    @KafkaListener(topics = "${spring.kafka.topic}")
    public void receiver(ConsumerRecord<String, String> record) {
        log.info("Received: topic={}, message={}", record.topic(), record.value());
    }
}
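Conceptually, this producer/consumer pair behaves like a queue: the producer appends records to the topic and the consumer reads them back in order (per partition). As a broker-free analogy using only the JDK (this is not the Kafka API, just an in-memory stand-in for the flow), the idea can be sketched with a BlockingQueue:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class QueueAnalogy {
    // Stand-in for the Kafka topic: an in-memory FIFO queue.
    static final BlockingQueue<String> topic = new LinkedBlockingQueue<>();

    // Like Producer.send(): append a message to the queue.
    static void send(String message) {
        topic.offer(message);
    }

    // Like the @KafkaListener method: drain messages in arrival order.
    static List<String> receiveAll() {
        List<String> received = new ArrayList<>();
        topic.drainTo(received);
        return received;
    }

    public static void main(String[] args) {
        send("first");
        send("second");
        System.out.println(receiveAll()); // prints [first, second]
    }
}
```

Kafka adds durability, partitioning, and consumer groups on top of this basic FIFO idea, which is why the real demo needs a running broker.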
Test
Start the Spring Boot application first, then run the test code.
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;

@SpringBootTest
class SpringbootKafkaApplicationTests {

    @Autowired
    private Producer producer;

    @Test
    void contextLoads() {
        for (int i = 0; i < 10; i++) {
            producer.send("Sending message " + (i + 1) + "!");
        }
    }
}
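The loop above hands ten numbered payloads to the producer. A standalone, broker-free sketch of those payload strings (message text rendered in English here):

```java
import java.util.ArrayList;
import java.util.List;

public class PayloadSketch {
    // Builds the same ten payload strings the test loop passes to Producer.send().
    static List<String> payloads() {
        List<String> messages = new ArrayList<>();
        for (int i = 0; i < 10; i++) {
            messages.add("Sending message " + (i + 1) + "!");
        }
        return messages;
    }

    public static void main(String[] args) {
        System.out.println(PayloadSketch.payloads().get(0)); // prints Sending message 1!
    }
}
```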
Test results
Output in the test window
Output in the Spring Boot window