[realms]
HADOOP.COM = {
    kdc = hdp-1:88
    admin_server = hdp-1:749
    default_domain = HADOOP.COM
}

[domain_realm]
.HADOOP.COM = HADOOP.COM
HADOOP.COM = HADOOP.COM
Tip: the sun.security.krb5.Config class was changed in JDK 11, so krb5.conf is parsed more strictly; if those entries are not removed, the following error is reported:
Caused by: KrbException: krb5.conf loading failed
![](https://img-blog.csdnimg.cn/01b665dbfdcf4b47a87771d943c87c97.png)
The readConfigFileLines method: ![](https://img-blog.csdnimg.cn/7660c5600c934f9091e4f6ea924088de.png)
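Before the producer code further down can authenticate, the client JVM also has to be told where this krb5.conf and the JAAS login configuration live. A minimal sketch, assuming illustrative file paths and a hypothetical class name (KerberosEnv); the debug flag is optional and only useful when troubleshooting the error above:

public class KerberosEnv {
    public static void main(String[] args) {
        // Point the JVM at the krb5.conf shown above (path is an assumed example).
        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
        // JAAS file that defines the KafkaClient login section (path is an assumed example).
        System.setProperty("java.security.auth.login.config", "/etc/kafka/kafka_client_jaas.conf");
        // Optional: print Kerberos handshake details while debugging.
        System.setProperty("sun.security.krb5.debug", "true");
    }
}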
#### 2. Modify the hosts file
192.168.16.14 hdp-1
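To confirm that the JVM actually picks up this mapping, a quick resolution check can help; this is just an illustrative sketch (the class name ResolveCheck is not from this article):

import java.net.InetAddress;

public class ResolveCheck {
    public static void main(String[] args) throws Exception {
        // Should print 192.168.16.14 once the hosts entry above is in place.
        System.out.println(InetAddress.getByName("hdp-1").getHostAddress());
    }
}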
#### 3. Add the dependency matching your Kafka version
<!-- Use the kafka-clients version that matches the Kafka version you have installed -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>3.1.0</version>
</dependency>
#### 4. Producer sample code
package com.example.demo.kafka;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;
/**
 * @Author: meng
 * @Version: 1.0
 */
public class ProductKafkaKerberos {

    public static void main(String[] args) {