Because this runs on big data, logging everything was impractical; only by selectively placing logs at a few chosen points and narrowing down did we finally flush out this bug.
Library used: sarama
addrs := strings.Split(config.Kafka.RealBroker, ",")
kafkaConfig := sarama.NewConfig()
kafkaConfig.Producer.Timeout = 3 * time.Second // Producer.Timeout is a time.Duration; a bare 3 would mean 3 nanoseconds
kafkaConfig.Producer.Return.Errors = false // set to false so undrained errors cannot pile up and block the producer; that blocking is intermittent and extremely hard to diagnose
kafkaConfig.Producer.Return.Successes = false
kafkaConfig.Producer.MaxMessageBytes = 50 * 1024 * 1024 // the broker's message.max.bytes must also allow messages this large
producer, err := sarama.NewAsyncProducer(addrs, kafkaConfig)
if err != nil {
log.Println("new async producer err: ", err)
_, file, line, _ := runtime.Caller(0)
dingding.PushMessage(fmt.Sprintf("new async producer err:%v file:%s line:%d", err, file, line))
return err
}
conn.KafkaProducer = producer
If kafkaConfig.Producer.Return.Errors and kafkaConfig.Producer.Return.Successes are set to true, there must be corresponding code that keeps draining the producer's Errors() and Successes() channels; otherwise deliveries can block.