Spring Boot integration with Kafka (producer and consumer), with a custom Kafka value serializer (complete demo included)

 

For installing Kafka on Windows, see: https://blog.csdn.net/github_38482082/article/details/82112641

Maven dependencies:

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>

        <!--kafka依赖配置-->
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
            <version>2.2.0.RELEASE</version>
        </dependency>

        <!--fastjson依赖配置-->
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>fastjson</artifactId>
            <version>1.2.4</version>
        </dependency>
    </dependencies>
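
spring-kafka 2.2.0.RELEASE is the line managed by Spring Boot 2.1.x, so this demo assumes a matching Boot parent. A minimal sketch of the parent declaration (the exact 2.1.x patch version shown here is illustrative):

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.1.0.RELEASE</version>
        <relativePath/>
    </parent>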

Custom value deserializer:

import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.JSONObject;
import org.apache.kafka.common.serialization.Deserializer;

import java.util.Map;

/**
 * Custom deserializer for JSONObject values in Kafka
 */
public class JsonDeserializer implements Deserializer<JSONObject> {

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {

    }

    @Override
    public JSONObject deserialize(String topic, byte[] data) {
        // Guard against null payloads (e.g. tombstone records) before parsing
        if (data == null) {
            return null;
        }
        return JSON.parseObject(data, JSONObject.class);
    }

    @Override
    public void close() {

    }
}

Custom value serializer:

import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.JSONObject;
import org.apache.kafka.common.serialization.Serializer;


import java.util.Map;

/**
 * Custom serializer for JSONObject values in Kafka
 */
public class JsonSerializer implements Serializer<JSONObject> {

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {

    }

    @Override
    public byte[] serialize(String topic, JSONObject data) {
        return JSON.toJSONBytes(data);
    }

    @Override
    public void close() {

    }
}
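
A quick way to sanity-check that the two classes are inverses of each other is a local round trip, without touching Kafka at all. A minimal sketch (the class name JsonSerdeRoundTripCheck is illustrative):

import com.alibaba.fastjson.JSONObject;

public class JsonSerdeRoundTripCheck {

    public static void main(String[] args) {
        JSONObject original = new JSONObject();
        original.put("test", "round-trip value");

        // Serialize to bytes, then deserialize back; the topic name is irrelevant here
        byte[] bytes = new JsonSerializer().serialize("kfk_test", original);
        JSONObject restored = new JsonDeserializer().deserialize("kfk_test", bytes);

        System.out.println(restored.getString("test")); // expected: round-trip value
    }
}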

Configure the custom serializer and deserializer in the yml file. Of course, you can also use the (de)serializers shipped with Kafka, such as org.apache.kafka.common.serialization.StringDeserializer; choose whichever fits your needs.

# Port
server:
  port: 8082

# Datasource (not needed for this demo, left commented out)
spring:
  #datasource:
  #  url:
  #  username:
  #  password:
  kafka:
    bootstrap-servers: 127.0.0.1:9092
    consumer:
      group-id: 0
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      #value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: com.example.kafka_example.JsonDeserializer
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      #value-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: com.example.kafka_example.JsonSerializer
      batch-size: 65536
      buffer-memory: 524288
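
For reference, the same serializer wiring could also be done in Java configuration instead of yml. A minimal sketch of the producer side under that assumption (the class name KafkaProducerConfig is illustrative; defining these beans takes precedence over Spring Boot's auto-configured template):

import com.alibaba.fastjson.JSONObject;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

import java.util.HashMap;
import java.util.Map;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, JSONObject> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // Plug in the custom fastjson serializer for message values
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, JSONObject> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}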

Kafka producer, using the KafkaTemplate that Spring Boot auto-configures:

import com.alibaba.fastjson.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;
import org.springframework.stereotype.Component;
import org.springframework.util.concurrent.FailureCallback;
import org.springframework.util.concurrent.ListenableFuture;
import org.springframework.util.concurrent.SuccessCallback;

/**
 * Kafka producer
 */
@Component
public class KafkaProducer {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaProducer.class);

    @Autowired
    private KafkaTemplate<String, JSONObject> kafkaTemplate;

    public void sendMessage(String channel, JSONObject message) {
        ListenableFuture<SendResult<String, JSONObject>> future = kafkaTemplate.send(channel, message);
        LOGGER.info("Kafka producer sent message to {}", channel);
        future.addCallback(new SuccessCallback<SendResult<String, JSONObject>>() {
            @Override
            public void onSuccess(SendResult<String, JSONObject> result) {
                LOGGER.info("Send succeeded, channel: {}, message: {}", channel, message);
            }
        }, new FailureCallback() {
            @Override
            public void onFailure(Throwable throwable) {
                LOGGER.error("Send failed", throwable);
            }
        });
    }
}
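
If per-entity ordering matters, the same producer could also expose an overload that sends with a message key, so that records sharing a key always land on the same partition. A minimal sketch of such a method, to be added inside the KafkaProducer class above (sendKeyedMessage is a hypothetical name, not part of the original demo):

    // Hypothetical overload: records with the same key are routed to the same partition
    public void sendKeyedMessage(String channel, String key, JSONObject message) {
        ListenableFuture<SendResult<String, JSONObject>> future = kafkaTemplate.send(channel, key, message);
        future.addCallback(
                result -> LOGGER.info("Send succeeded, channel: {}, key: {}", channel, key),
                throwable -> LOGGER.error("Send failed", throwable));
    }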

Kafka consumer, listening on the kfk_test topic:

import com.alibaba.fastjson.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

/**
 * Kafka consumer
 */
@Component
public class KafkaConsumer {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaConsumer.class);

    @KafkaListener(topics = {"kfk_test"})
    public void receiveDeviceData(JSONObject message) {
        LOGGER.info("Received message from Kafka: {}", message);
    }
}
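
If the listener also needs record metadata such as partition and offset, @KafkaListener methods can accept the full ConsumerRecord instead of just the value. A minimal sketch, assuming the same topic and yml configuration as above (the class name KafkaRecordConsumer is illustrative):

import com.alibaba.fastjson.JSONObject;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class KafkaRecordConsumer {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaRecordConsumer.class);

    @KafkaListener(topics = {"kfk_test"})
    public void receiveRecord(ConsumerRecord<String, JSONObject> record) {
        // The value is already a JSONObject thanks to the custom JsonDeserializer;
        // the record additionally exposes topic, partition and offset metadata
        LOGGER.info("topic={}, partition={}, offset={}, value={}",
                record.topic(), record.partition(), record.offset(), record.value());
    }
}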

Create a class that implements ApplicationRunner so the producer is called automatically once the application starts; in a real application you would simply call the producer wherever it is needed (a sketch of calling it from a REST controller follows the runner class below).

import com.alibaba.fastjson.JSONObject;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.stereotype.Service;

@Service
// Implements the ApplicationRunner interface, so run() executes automatically after startup
public class TimerService implements ApplicationRunner {

    @Autowired
    private KafkaProducer kafkaProducer;

    @Override
    public void run(ApplicationArguments args) throws Exception {
        JSONObject message = new JSONObject();
        message.put("test", "data sent by the producer");

        kafkaProducer.sendMessage("kfk_test", message);
    }
}
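
For an on-demand trigger instead of the startup runner, the producer could be called from a REST endpoint, for example. A minimal sketch, assuming spring-boot-starter-web is also on the classpath (the controller class name and URL path are illustrative):

import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.JSONObject;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class MessageController {

    @Autowired
    private KafkaProducer kafkaProducer;

    @PostMapping("/message")
    public void sendMessage(@RequestBody String body) {
        // Parse the raw JSON request body with fastjson and forward it to the kfk_test topic
        JSONObject message = JSON.parseObject(body);
        kafkaProducer.sendMessage("kfk_test", message);
    }
}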
Application entry class:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class KafkaExampleApplication {

    public static void main(String[] args) {
        SpringApplication.run(KafkaExampleApplication.class, args);
    }

}

Run result: (screenshots omitted)

Code: https://github.com/NattyUnicorn/springboot_kafka_example.git

 
