When running Flink against a Hadoop environment secured with Kerberos, you can configure authentication directly in flink-conf.yaml. The contexts entry lists the JAAS login contexts that should use the Kerberos credentials (Client for ZooKeeper, KafkaClient for Kafka), and the keytab/principal pair tells Flink which identity to log in as:
security.kerberos.login.contexts: Client,KafkaClient
security.kerberos.login.use-ticket-cache: true
security.kerberos.login.keytab: /etc/security/keytab/hdfs.keytab
security.kerberos.login.principal: hdfs@HADOOP.COM
Note that java.security.auth.login.config must point to a file on the local filesystem of every node; the JVM cannot load a JAAS file from an hdfs:// URL, so distribute kafka_jaas.conf to each machine first (the local path below is an example):
env.java.opts: -Djava.security.krb5.conf=/etc/krb5.conf -Djava.security.auth.login.config=/etc/security/kafka_jaas.conf
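For reference, a minimal kafka_jaas.conf providing the KafkaClient context might look like the sketch below. The keytab path and principal are assumptions chosen to match the flink-conf.yaml settings above; adjust them to your own environment.

```
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  // Assumed to match security.kerberos.login.keytab/principal above
  keyTab="/etc/security/keytab/hdfs.keytab"
  principal="hdfs@HADOOP.COM";
};
```

Alternatively, when security.kerberos.login.contexts includes KafkaClient, Flink can generate this JAAS entry dynamically from the configured keytab and principal, in which case a hand-written file may be unnecessary.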