Reading and Writing Kafka from Java with Kerberos Authentication

1 Producer, straight to the code:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Date;
import java.util.Properties;
import java.util.UUID;

public class JavaKafkaWithKerberos {
    public static void main(String[] args) throws Exception {

        System.setProperty("java.security.auth.login.config", "d:/myconf/jaas.conf");
        System.setProperty("java.security.krb5.conf", "d:/myconf/krb5.conf");

//        System.setProperty("java.security.auth.login.config", "/tmp/myconf/jaas.conf");
//        System.setProperty("java.security.krb5.conf", "/tmp/myconf/krb5.conf");

        Properties props = new Properties();
        props.put("bootstrap.servers", "1000000000000:9092,000000000000:9092,0000000000000:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        //props.put("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer");

        // Kerberos security settings
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "GSSAPI");
        props.put("sasl.kerberos.service.name", "kafka");

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);

        for (int i = 0; i < 10000; i++) {
            String s = UUID.randomUUID().toString() + " " + i + " Test Date: " + new Date();
            System.out.println(s);
            producer.send(new ProducerRecord<>("sink-topic", s)); // value only; the key is null
            Thread.sleep(1000);
        }

        producer.close();
    }
}
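The commented-out lines above swap the Windows paths for Linux ones by hand. A small helper (hypothetical class and method names, not part of the original code) can centralize that switch so only the config directory changes between environments:

```java
// Hypothetical helper: point the JVM at the JAAS/krb5 files under one directory.
public class KerberosEnv {
    /** Sets the two JVM-wide Kerberos properties from a single config directory. */
    public static void apply(String confDir) {
        System.setProperty("java.security.auth.login.config", confDir + "/jaas.conf");
        System.setProperty("java.security.krb5.conf", confDir + "/krb5.conf");
    }

    public static void main(String[] args) {
        // e.g. "d:/myconf" on Windows, "/tmp/myconf" on Linux
        apply(args.length > 0 ? args[0] : "/tmp/myconf");
        System.out.println(System.getProperty("java.security.krb5.conf"));
    }
}
```

Note that these are JVM-wide system properties, so they must be set before the first Kafka client is constructed.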

2 Consumer, straight to the code:

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class JavaKafkaConsumerWithKerberos {
    public static void main(String[] args) throws Exception {

        System.setProperty("java.security.auth.login.config", "d:/myconf/jaas.conf");
        System.setProperty("java.security.krb5.conf", "d:/myconf/krb5.conf");

//        System.setProperty("java.security.auth.login.config", "/tmp/myconf/jaas.conf");
//        System.setProperty("java.security.krb5.conf", "/tmp/myconf/krb5.conf");

        Properties props = new Properties();
        props.put("bootstrap.servers", "000000000000:9092,000000000000:9092,000000000000:9092");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("group.id", "test");

        // Kerberos security settings
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "GSSAPI");
        props.put("sasl.kerberos.service.name", "kafka");

        KafkaConsumer<String, String> kafkaConsumer = new KafkaConsumer<>(props);
        //kafkaConsumer.subscribe(Collections.singletonList("cust_info"));
        kafkaConsumer.subscribe(Collections.singletonList("test"));

        while (true) {
            ConsumerRecords<String, String> records = kafkaConsumer.poll(Duration.ofMillis(1000));
            for (ConsumerRecord<String, String> record : records) {
                System.out.println("Partition: " + record.partition() + " Offset: " + record.offset()
                        + " Value: " + record.value() + " ThreadID: " + Thread.currentThread().getId());
            }
        }
    }
}

3 Configuration file: jaas.conf

KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="d:/myconf/hadoop.keytab"
    storeKey=true
    useTicketCache=false
    serviceName="kafka"
    principal="hadoop/cdh@HADOOP.COM";
};
Client {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    useTicketCache=false
    serviceName="kafka"
    keyTab="d:/myconf/hadoop.keytab"
    principal="hadoop/cdh@HADOOP.COM";
};
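The code also points `java.security.krb5.conf` at `d:/myconf/krb5.conf`, which is not shown above. A minimal sketch for the HADOOP.COM realm used by the principal might look like the following; the KDC hostname `cdh` is an assumption taken from the principal name, so substitute your own KDC host:

```
[libdefaults]
    default_realm = HADOOP.COM
    dns_lookup_kdc = false
    ticket_lifetime = 24h

[realms]
    HADOOP.COM = {
        kdc = cdh
        admin_server = cdh
    }
```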

Note: with the keytab configured as above, there is no need to run `kinit -kt <keytab>` before running the job.

If no keytab path is specified in jaas.conf, running `kinit -kt /home/hadoop/hadoop.keytab` before starting the job achieves the same result.
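For that ticket-cache approach, the `KafkaClient` section would rely on the credential cache populated by `kinit` instead of a keytab. A sketch, assuming the same principal as above:

```
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=false
    useTicketCache=true
    serviceName="kafka"
    principal="hadoop/cdh@HADOOP.COM";
};
```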

To integrate SASL-authenticated Kafka into a Spring Boot application, you can follow these steps:

1. Add the Kafka configuration to the startup class. Enable Kafka with the `@EnableKafka` annotation and configure the connection settings in `application.properties`.

2. Create a `kafka_client_jaas.conf` file containing the client login configuration. For example:

```
KafkaClient {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="bob"
    password="bob-pwd";
};
```

Note that this example uses the SASL/PLAIN mechanism; for Kerberos (GSSAPI), use `com.sun.security.auth.module.Krb5LoginModule` as in the jaas.conf shown earlier. Place the file in a suitable location, e.g. `C:/Users/JustryDeng/Desktop/kerberos/kafka_client_jaas.conf`.

3. In the startup class, point the `java.security.auth.login.config` system property at that file:

```
private static void systemPropertiesConfig() {
    System.setProperty("java.security.auth.login.config", "C:/Users/JustryDeng/Desktop/kerberos/kafka_client_jaas.conf");
}
```

The Kafka client will then use this login configuration when connecting and authenticating.
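For step 1, `application.properties` can carry the same SASL settings used in the plain-Java examples above via the standard `spring.kafka.*` property names; the broker address here is a placeholder:

```
spring.kafka.bootstrap-servers=broker1:9092
spring.kafka.properties.security.protocol=SASL_PLAINTEXT
spring.kafka.properties.sasl.mechanism=GSSAPI
spring.kafka.properties.sasl.kerberos.service.name=kafka
spring.kafka.consumer.group-id=test
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
```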