- The callback is invoked when the producer receives an ack, so the call is asynchronous. It takes two parameters: the record metadata (RecordMetadata) and the exception (Exception). If the Exception is null, the message was sent successfully; if it is not null, the send failed.
- Code with a callback:
```java
package com.atguigu.kafka.controller;

import org.apache.kafka.clients.producer.*;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.Properties;

@RestController
@RequestMapping("kafka")
public class CustomProducerCallbackController {

    @RequestMapping("sendMsgwithCallback")
    public void sendMsg() {
        Properties properties = new Properties();
        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        KafkaProducer<String, String> kafkaProducer = new KafkaProducer<>(properties);
        for (int i = 0; i < 500; i++) {
            kafkaProducer.send(new ProducerRecord<>("ums-cardbin", "message from the Kafka producer " + i), new Callback() {
                @Override
                public void onCompletion(RecordMetadata metadata, Exception exception) {
                    if (exception == null) {
                        // Send succeeded: print where the record landed.
                        System.out.println("topic: " + metadata.topic() + " partition: " + metadata.partition());
                    } else {
                        // Send failed (after any automatic retries were exhausted).
                        exception.printStackTrace();
                    }
                }
            });
            try {
                Thread.sleep(2);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        kafkaProducer.close();
    }
}
```
The "asynchronous" above refers only to pushing the message to the topic: send() returns immediately, and the callback fires later once the ack arrives.
Note: retriable send failures are retried automatically by the producer (governed by the `retries` configuration), so we do not need to retry manually in the callback.
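How those automatic retries behave is controlled by producer configuration. A minimal JDK-only sketch of the relevant settings, using the raw string keys that the `ProducerConfig` constants map to (the values shown are illustrative, not recommendations):

```java
import java.util.Properties;

public class RetryConfigSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // "retries": how many times the producer resends a record after a
        // retriable error; modern clients default this to a very large value
        // and bound the total time with "delivery.timeout.ms" instead.
        props.put("retries", "3");
        // "acks=all": the callback (or Future) only reports success once the
        // full in-sync replica set has acknowledged the record.
        props.put("acks", "all");
        // Upper bound on the whole send, including all retries.
        props.put("delivery.timeout.ms", "120000");
        System.out.println(props.getProperty("acks"));
    }
}
```

These entries would be added to the same `Properties` object passed to the `KafkaProducer` constructor in the controllers above.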
- Synchronous send flow
So-called synchronous sending means that only after all messages currently in the buffer queue have been sent to the broker by the sender thread is the next batch of messages written into the queue. In code this is done by calling get() on the Future returned by send(), which blocks until the broker acknowledges the record.
```java
package com.atguigu.kafka.controller;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.Properties;
import java.util.concurrent.ExecutionException;

@RestController
@RequestMapping("kafka")
public class CustomProducerSyncController {

    @RequestMapping("sendMsgwithSync")
    public void sendMsg() {
        Properties properties = new Properties();
        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        KafkaProducer<String, String> kafkaProducer = new KafkaProducer<>(properties);
        for (int i = 0; i < 5; i++) {
            try {
                // get() blocks until the broker acknowledges the record,
                // turning the asynchronous send into a synchronous one.
                kafkaProducer.send(new ProducerRecord<>("ums-cardbin", "atguigu" + i)).get();
            } catch (InterruptedException | ExecutionException e) {
                e.printStackTrace();
            }
        }
        kafkaProducer.close();
    }
}
```
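The blocking behaviour comes entirely from `Future.get()`: `send()` itself still returns a `Future<RecordMetadata>` immediately. A standalone JDK-only sketch of the same pattern, where the thread pool stands in for the sender thread and the `"topic-0@42"` string stands in for the broker's RecordMetadata (both are assumptions for illustration):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

public class SyncSendSketch {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        // Like producer.send(): submit() returns immediately with a Future
        // while the work happens on another thread.
        Future<String> ack = pool.submit(() -> {
            Thread.sleep(100);    // pretend network round-trip to the broker
            return "topic-0@42";  // pretend RecordMetadata (topic-partition@offset)
        });
        // Calling get() is what makes the send synchronous: the caller
        // blocks here until the "ack" arrives (or the timeout expires).
        String metadata = ack.get(5, TimeUnit.SECONDS);
        System.out.println(metadata);
        pool.shutdown();
    }
}
```

In the real producer, `ack.get()` would either return the RecordMetadata or throw an ExecutionException wrapping the send failure, which is why the controller above catches it.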