First, install the JDK; this is a prerequisite. This guide uses JDK 1.8. Download it from the Oracle website, or from this mirror: https://pan.baidu.com/s/1fzyWQ3IIg5hbvJaZ3EanPg (access code: 1234).
Make sure to download the 64-bit JDK. This matters: with a 32-bit JDK installed, Kafka later failed to start with java.lang.OutOfMemoryError: Map failed.
Uninstall any existing 32-bit JDK first. If installing or uninstalling the JDK on Windows 10 fails with error 2503/2502, one workaround is to remove it with a third-party uninstaller (360 was used here).
-
Install the downloaded 64-bit JDK (the installation steps are omitted here). After the installation completes, continue with the steps below.
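Since the 32-bit/64-bit distinction is what caused the Kafka startup error above, it can be worth confirming from code which JVM is actually on the path. A minimal sketch using only standard system properties (the class name is made up for illustration):

```java
// Prints architecture properties of the running JVM.
// On a 64-bit HotSpot JDK, "sun.arch.data.model" reports "64";
// on the problematic 32-bit install it would report "32".
public class JvmArchCheck {
    public static void main(String[] args) {
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("os.arch      = " + System.getProperty("os.arch"));
        System.out.println("data model   = " + System.getProperty("sun.arch.data.model"));
    }
}
```

Compile and run it with the same `java` the services will use, so you are checking the JVM Kafka will actually start with.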
-
Download ZooKeeper from the official site, or use this share: https://pan.baidu.com/s/1ZGl84Ddc0L_JqNZ7QBDX5A (access code: 1234). The package in that share is already configured.
-
Start ZooKeeper by running zookeeper\apache-zookeeper-3.5.8-bin\bin\zkServer.cmd.
If startup fails with "The system cannot find the path specified." and "Error: JAVA_HOME is incorrectly set", a quick fix is to edit zkServer.cmd and replace the full java executable path in the startup command with plain java.
-
Download Kafka from the official site, or from the same share: https://pan.baidu.com/s/1ZGl84Ddc0L_JqNZ7QBDX5A (access code: 1234). Note a pitfall with the official download: you must download the binary release (the archive that contains a bin directory), not the source archive, or startup will fail.
-
Start Kafka from a command prompt in the installation directory: .\bin\windows\kafka-server-start.bat .\config\server.properties. If the console window flashes and closes immediately, append pause to the end of the startup script (kafka-server-start.bat) so the window stays open and shows the actual error. If you get "cannot find the specified file", use the provided package.
-
Integrating Kafka with Spring Boot
-
Project structure:
pom.xml dependencies:
<!-- Kafka -->
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
<!-- slf4j -->
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>jcl-over-slf4j</artifactId>
    <version>1.7.25</version>
</dependency>
<!-- jackson -->
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.10.0</version>
</dependency>
<!-- web -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- lombok -->
<dependency>
    <groupId>org.projectlombok</groupId>
    <artifactId>lombok</artifactId>
    <optional>true</optional>
</dependency>
application.yml configuration:
server:
  port: 5003
spring:
  kafka:
    consumer:
      bootstrap-servers: localhost:9092
      # Where to start when a consumer group has no committed offset
      # (earliest: a reconnecting consumer receives messages from the beginning)
      auto-offset-reset: earliest
    producer:
      bootstrap-servers: localhost:9092
      # Serialize message payloads as JSON
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
kafka:
  topic:
    my-topic: my-topic
    my-topic2: my-topic2
The KafkaConfig configuration class:
import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.support.converter.RecordMessageConverter;
import org.springframework.kafka.support.converter.StringJsonMessageConverter;

@Configuration
public class KafkaConfig {

    @Value("${kafka.topic.my-topic}")
    String myTopic;

    @Value("${kafka.topic.my-topic2}")
    String myTopic2;

    /**
     * JSON message converter, so listeners can receive payloads as typed objects
     */
    @Bean
    public RecordMessageConverter jsonConverter() {
        return new StringJsonMessageConverter();
    }

    /**
     * Create topic 1 (2 partitions, replication factor 1)
     */
    @Bean
    public NewTopic myTopic() {
        return new NewTopic(myTopic, 2, (short) 1);
    }

    /**
     * Create topic 2 (1 partition, replication factor 1)
     */
    @Bean
    public NewTopic myTopic2() {
        return new NewTopic(myTopic2, 1, (short) 1);
    }
}
The message entity:
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

@Data
@AllArgsConstructor
@NoArgsConstructor
public class TestMessage {
    private Long id;
    private String name;
}
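Lombok's @Data, @AllArgsConstructor and @NoArgsConstructor generate the boilerplate at compile time. For readers not using Lombok, the equivalent hand-written class looks roughly like this (a sketch of what the annotations expand to, not byte-for-byte what Lombok emits):

```java
// Plain-Java equivalent of the Lombok-annotated TestMessage entity
public class TestMessage {
    private Long id;
    private String name;

    public TestMessage() { }                    // from @NoArgsConstructor

    public TestMessage(Long id, String name) {  // from @AllArgsConstructor
        this.id = id;
        this.name = name;
    }

    // @Data generates getters, setters, equals/hashCode and toString
    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    @Override
    public String toString() {
        return "TestMessage(id=" + id + ", name=" + name + ")";
    }
}
```

The no-args constructor matters here: Jackson needs it to deserialize the JSON payload back into a TestMessage on the consumer side.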
The producer:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;
import org.springframework.stereotype.Service;
import org.springframework.util.concurrent.ListenableFuture;
import org.springframework.util.concurrent.ListenableFutureCallback;

@Service
public class ProducerService {

    private static final Logger logger = LoggerFactory.getLogger(ProducerService.class);

    private final KafkaTemplate<String, Object> kafkaTemplate;

    public ProducerService(KafkaTemplate<String, Object> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendMessage(String topic, Object o) {
        // Send the message asynchronously
        ListenableFuture<SendResult<String, Object>> future = kafkaTemplate.send(topic, o);
        // Register callbacks on the returned future
        future.addCallback(new ListenableFutureCallback<SendResult<String, Object>>() {
            // Success callback
            @Override
            public void onSuccess(SendResult<String, Object> sendResult) {
                logger.info("Producer sent message to {} -> {}", topic, sendResult.getProducerRecord().value());
            }

            // Failure callback
            @Override
            public void onFailure(Throwable throwable) {
                logger.error("Producer failed to send message {}, cause: {}", o, throwable.getMessage());
            }
        });
    }
}
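Because the producer is configured with JsonSerializer, what actually goes over the wire for a TestMessage is a small JSON document. The hand-rolled sketch below (no Jackson required; the class and method names are made up for illustration) shows the expected shape of that payload:

```java
public class JsonShapeDemo {
    // Hand-rolled illustration of the JSON produced for the message payload;
    // the real serialization is done by Spring Kafka's JsonSerializer (Jackson).
    static String toJson(Long id, String name) {
        return "{\"id\":" + id + ",\"name\":\"" + name + "\"}";
    }

    public static void main(String[] args) {
        System.out.println(toJson(1L, "kafka")); // {"id":1,"name":"kafka"}
    }
}
```

This is also the string the first consumer below receives in its ConsumerRecord value before mapping it back to a TestMessage.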
The consumer:
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.kafka.demo.Entity.TestMessage;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class ConsumerService {

    @Value("${kafka.topic.my-topic}")
    private String myTopic;

    @Value("${kafka.topic.my-topic2}")
    private String myTopic2;

    private final Logger logger = LoggerFactory.getLogger(ConsumerService.class);
    private final ObjectMapper objectMapper = new ObjectMapper();

    // First consumer: receives the raw record and deserializes the JSON manually
    @KafkaListener(topics = {"${kafka.topic.my-topic}"}, groupId = "group1")
    public void consumeMessage(ConsumerRecord<String, String> consumerRecord) {
        try {
            TestMessage message = objectMapper.readValue(consumerRecord.value(), TestMessage.class);
            logger.info("Consumer read from topic {} partition {} -> {}", consumerRecord.topic(), consumerRecord.partition(), message);
        } catch (JsonProcessingException e) {
            logger.error("Failed to deserialize message: {}", consumerRecord.value(), e);
        }
    }

    // Second consumer: the JSON message converter maps the payload to TestMessage directly
    @KafkaListener(topics = {"${kafka.topic.my-topic2}"}, groupId = "group2")
    public void consumeMessage2(TestMessage message) {
        logger.info("Consumer read from topic {} -> {}", myTopic2, message);
    }
}
Define a controller endpoint:
import com.kafka.demo.Entity.TestMessage;
import com.kafka.demo.KafKaService.ProducerService;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import java.util.concurrent.atomic.AtomicLong;

@RestController
@RequestMapping(value = "/test")
public class BookController {

    @Value("${kafka.topic.my-topic}")
    String myTopic;

    @Value("${kafka.topic.my-topic2}")
    String myTopic2;

    private final ProducerService producer;
    private final AtomicLong atomicLong = new AtomicLong();

    BookController(ProducerService producer) {
        this.producer = producer;
    }

    // Sends the same name to both topics, each with a distinct generated id
    @PostMapping
    public void sendMessageToKafkaTopic(@RequestParam("name") String name) {
        this.producer.sendMessage(myTopic, new TestMessage(atomicLong.addAndGet(1), name));
        this.producer.sendMessage(myTopic2, new TestMessage(atomicLong.addAndGet(1), name));
    }
}
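The controller generates message ids with an AtomicLong so that concurrent requests still get unique, monotonically increasing values without explicit locking. A minimal standalone sketch of that behavior (the class name is made up for illustration):

```java
import java.util.concurrent.atomic.AtomicLong;

public class IdGeneratorDemo {
    public static void main(String[] args) {
        AtomicLong counter = new AtomicLong();
        // addAndGet(1) atomically increments and returns the new value,
        // equivalent to incrementAndGet()
        System.out.println(counter.addAndGet(1));      // 1
        System.out.println(counter.addAndGet(1));      // 2
        System.out.println(counter.incrementAndGet()); // 3
    }
}
```

Note the counter resets when the application restarts; for ids that must be unique across restarts, a database sequence or similar would be needed instead.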
Testing: send a POST request to http://localhost:5003/test with a name parameter.
Result: both consumers log the consumed messages.