Spring Boot + Docker: Local Kafka Integration Demo
Prerequisites
- docker and docker-compose are installed
Experiment environment
- OS: Ubuntu 18.04
- Spring Boot: 2.0.4
- Docker: 18.06.1-ce
- docker-compose: 1.23.1
- Maven: 3.5.4
- JDK: 1.8.0_181
Setting up Kafka and ZooKeeper with Docker
- Pull the images (you can skip this step when starting with docker-compose, which pulls them automatically):

  ```shell
  docker pull wurstmeister/kafka
  docker pull zookeeper
  ```
- Create a file named `docker-compose.yml` in a directory of your choice, with the following content:

  ```yaml
  version: '2'
  services:
    zoo1:
      image: zookeeper
      restart: always
      hostname: zoo1
      ports:
        - 2181:2181
      environment:
        ZOO_MY_ID: 1
    kafka1:
      image: wurstmeister/kafka
      links:
        - zoo1:zk
      ports:
        - 9092:9092
      environment:
        KAFKA_ADVERTISED_HOST_NAME: 127.0.0.1
        KAFKA_ZOOKEEPER_CONNECT: zk:2181
      volumes:
        - /var/run/docker.sock:/var/run/docker.sock
      depends_on:
        - zoo1
  ```
- Run:

  ```shell
  docker-compose up -d
  ```
- Output similar to the following indicates a successful start:

  ```
  Creating network "zookeeper_kafka_default" with the default driver
  Creating zookeeper_kafka_zoo1_1_86ebb921f61e   ... done
  Creating zookeeper_kafka_kafka1_1_a94c17a049d9 ... done
  ```
- Tip: you can also use plain docker without docker-compose; derive the `docker run` commands from the `docker-compose.yml` file above.
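To expand on that tip, here is a sketch of equivalent plain-docker commands, assuming the same images, ports, and environment variables as the compose file above; the container and network names are illustrative:

```shell
# User-defined network so kafka1 can reach zoo1 by name
docker network create kafka-net

# ZooKeeper, same settings as the zoo1 service
docker run -d --name zoo1 --network kafka-net \
  -p 2181:2181 -e ZOO_MY_ID=1 zookeeper

# Kafka, same settings as the kafka1 service
docker run -d --name kafka1 --network kafka-net \
  -p 9092:9092 \
  -e KAFKA_ADVERTISED_HOST_NAME=127.0.0.1 \
  -e KAFKA_ZOOKEEPER_CONNECT=zoo1:2181 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  wurstmeister/kafka
```

Note that on a user-defined network the container name (`zoo1`) replaces the legacy `links` alias (`zk`) used in the compose file.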
Create a Spring Boot project named kafka-demo
- Add the dependencies:

  ```xml
  <dependencies>
      <dependency>
          <groupId>com.alibaba</groupId>
          <artifactId>fastjson</artifactId>
          <version>1.2.47</version>
      </dependency>
      <dependency>
          <groupId>org.springframework.boot</groupId>
          <artifactId>spring-boot-starter-web</artifactId>
      </dependency>
      <dependency>
          <groupId>org.springframework.kafka</groupId>
          <artifactId>spring-kafka</artifactId>
      </dependency>
      <dependency>
          <groupId>org.projectlombok</groupId>
          <artifactId>lombok</artifactId>
          <optional>true</optional>
      </dependency>
  </dependencies>
  ```
- No Kafka settings are needed in `application.yml`; the defaults are sufficient:

  ```yaml
  server:
    port: 21118
  spring:
    application:
      name: kafka-demo
  ```
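If you do need to override the defaults (for example, a broker that is not on `localhost:9092`), the standard `spring.kafka.*` properties can be set explicitly. A sketch, assuming the broker and the consumer group used later in this demo:

```yaml
spring:
  kafka:
    bootstrap-servers: 127.0.0.1:9092
    consumer:
      group-id: 1
      auto-offset-reset: earliest
```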
- Create a `DemoService` that produces messages:

  ```java
  @Slf4j
  @Service
  public class DemoService {

      private final KafkaTemplate<String, String> kafkaTemplate;

      @Autowired
      public DemoService(KafkaTemplate<String, String> kafkaTemplate) {
          this.kafkaTemplate = kafkaTemplate;
      }

      public void send() {
          MessageDto message = new MessageDto();
          message.setId(System.currentTimeMillis());
          message.setMsg(UUID.randomUUID().toString());
          message.setSendTime(new Date());
          log.info("sent message: {}", JSON.toJSONString(message));
          kafkaTemplate.send("topic1", JSON.toJSONString(message));
      }
  }
  ```

  The `KafkaTemplate` can be injected and used directly with the auto-configured defaults, or you can set it up yourself in a custom configuration class. The first argument of `kafkaTemplate#send` is the topic (see the Kafka documentation for the concept); the second is the message payload.
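The `MessageDto` used above is not shown in the original article; here is a minimal sketch, with field names inferred from the setters that `DemoService#send` calls (hypothetical, your actual DTO may differ):

```java
import java.util.Date;

// Hypothetical sketch: fields are inferred from the setters used in
// DemoService#send (setId, setMsg, setSendTime).
public class MessageDto {
    private Long id;       // e.g. System.currentTimeMillis()
    private String msg;    // e.g. a random UUID string
    private Date sendTime; // e.g. new Date()

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }

    public String getMsg() { return msg; }
    public void setMsg(String msg) { this.msg = msg; }

    public Date getSendTime() { return sendTime; }
    public void setSendTime(Date sendTime) { this.sendTime = sendTime; }
}
```

Since `lombok` is already in the pom, annotating the class with `@Data` and deleting the hand-written accessors would be the more idiomatic choice.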
- Create a `ReceiverService` that consumes messages:

  ```java
  @Service
  @Slf4j
  public class ReceiverService {

      @KafkaListener(topics = {"topic1"}, groupId = "1")
      public void receive(ConsumerRecord<?, ?> record) {
          Optional<?> message = Optional.ofNullable(record.value());
          message.ifPresent(msg -> log.info("message: {}", msg));
      }
  }
  ```

  Note that the `topics` value of `@KafkaListener` must match the topic used by the producer, and a `groupId` (any value you choose) must be specified.
- Create a `Controller`:

  ```java
  @RestController
  @RequestMapping("/v1/messages")
  public class KafkaController {

      private final DemoService demoService;

      @Autowired
      public KafkaController(DemoService demoService) {
          this.demoService = demoService;
      }

      @PostMapping
      public String post() {
          demoService.send();
          return "message";
      }
  }
  ```
- The generated `KafkaDemoApplication` class stays unchanged:

  ```java
  @SpringBootApplication
  public class KafkaDemoApplication {

      public static void main(String[] args) {
          SpringApplication.run(KafkaDemoApplication.class, args);
      }
  }
  ```
Verification
- Start the project
- Send a POST request to the endpoint
  - using postman, or
  - using curl:

    ```shell
    curl -X POST "http://127.0.0.1:21118/v1/messages"
    ```
- Check the application logs; if you see both the sent and the received message, the demo works
Possible causes of errors
- Incorrect docker port mappings, or unhealthy containers; fix the issue and restart
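To diagnose those cases, the usual docker commands apply; a sketch (the container names below are examples taken from the startup output earlier, yours will differ):

```shell
# List running containers with their port mappings and status
docker ps

# Inspect the logs of the Kafka and ZooKeeper containers
docker logs zookeeper_kafka_kafka1_1_a94c17a049d9
docker logs zookeeper_kafka_zoo1_1_86ebb921f61e

# After fixing docker-compose.yml, recreate the stack
docker-compose down
docker-compose up -d
```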