Kafka Security Authentication with SASL/PLAINTEXT (Username/Password Authentication)
Background
Kafka offers several security authentication mechanisms, falling into two broad categories: SSL and SASL. SASL/PLAIN is a username/password-based mechanism and one of the most commonly used. I recently set up authentication for a Kafka cluster and found the official documentation unclear on this topic; there are plenty of blog posts around, but their quality varies widely and many of them lead into pitfalls. After a day of experimentation I finally got it working, so this post records the steps.
I. Configuring SASL for the Zookeeper Cluster
All Zookeeper nodes are peers; only their roles may differ. The steps below apply the same configuration to every node.
1. Configure zoo.cfg
To add SASL support to Zookeeper, append the following to the zoo.cfg configuration file:
authProvider.1=org.apache.zookeeper.server.auth.SASLAuthenticationProvider
requireClientAuthScheme=sasl
jaasLoginRenew=3600000
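For orientation, here is a minimal sketch of what zoo.cfg might look like after the change; the data directory, ports, and server addresses below are assumptions and should be replaced with your own values:
# existing cluster settings (example values)
tickTime=2000
initLimit=10
syncLimit=5
dataDir=/opt/zookeeper-3.4.14/data
clientPort=2181
server.1=192.168.2.101:2888:3888
server.2=192.168.2.102:2888:3888
server.3=192.168.2.103:2888:3888
# SASL support added in this step
authProvider.1=org.apache.zookeeper.server.auth.SASLAuthenticationProvider
requireClientAuthScheme=sasl
jaasLoginRenew=3600000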
2. Create a zoo_jaas.conf file to hold Zookeeper's account information
The file can live anywhere, as long as the path configured in zkEnv.sh later points to it; here it is placed under /opt/zookeeper-3.4.14/conf, which matches the zkEnv.sh snippet below. The content of zoo_jaas.conf is as follows:
Server {
org.apache.kafka.common.security.plain.PlainLoginModule required
username="kafka"
password="kafkapwd"
user_kafka="kafkapwd";
};
username and password are the credentials used for authentication between the Zookeeper nodes themselves.
user_kafka="kafkapwd" defines a user named "kafka" with the password "kafkapwd"; the Kafka brokers will use this account when they connect to Zookeeper.
3. Copy the required Kafka jars into Zookeeper
Zookeeper's authentication mechanism is plugin-based, and the plugin used here is org.apache.kafka.common.security.plain.PlainLoginModule, so the kafka-clients jar and its dependencies must be made visible to Zookeeper. They can be found in the lib directory of the Kafka installation; the exact versions depend on the Kafka release.
The required jars are listed below. Create a directory named zk_sasl_lib under the Zookeeper installation and copy the jars into it (the name and location are arbitrary, as long as they match the reference in zkEnv.sh later):
kafka-clients-1.1.1.jar
lz4-java-1.4.1.jar
slf4j-api-1.7.25.jar
slf4j-log4j12-1.7.25.jar
snappy-java-1.1.7.1.jar
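A minimal shell sketch of this step, assuming Kafka lives under /opt/kafka_2.11-1.1.1 and Zookeeper under /opt/zookeeper-3.4.14 (adjust the paths and jar versions to your environment):
# create the target directory and copy the jars out of Kafka's libs directory
mkdir -p /opt/zookeeper-3.4.14/zk_sasl_lib
cd /opt/kafka_2.11-1.1.1/libs
cp kafka-clients-1.1.1.jar lz4-java-1.4.1.jar slf4j-api-1.7.25.jar \
   slf4j-log4j12-1.7.25.jar snappy-java-1.1.7.1.jar /opt/zookeeper-3.4.14/zk_sasl_lib/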
4. Modify zkEnv.sh so that Zookeeper picks up these jars and the JAAS file
Find zkEnv.sh in the bin directory of the Zookeeper installation and add the code below.
Note that the directory in the for loop must be the one holding the jars from the previous step, and the path in SERVER_JVMFLAGS must point to the zoo_jaas.conf file created earlier.
for i in /opt/zookeeper-3.4.14/zk_sasl_lib/*.jar;
do
CLASSPATH="$i:$CLASSPATH"
done
SERVER_JVMFLAGS=" -Djava.security.auth.login.config=/opt/zookeeper-3.4.14/conf/zoo_jaas.conf"
5. Restart the Zookeeper service
zkServer.sh restart
Check the status with:
zkServer.sh status
II. Configuring SASL for the Kafka Cluster
Note: perform the same steps on every node.
1. Create kafka_server_jaas.conf (the file name is up to you) to hold Kafka's account information
The content is shown below. The Client section corresponds to the Zookeeper accounts configured earlier, while the KafkaServer section corresponds to the KafkaClient section that producer and consumer programs will read later and holds their credentials; do not mix the two up:
KafkaServer {
org.apache.kafka.common.security.plain.PlainLoginModule required
username="zsh"
password="zshpwd"
user_zsh="zshpwd" ;
};
Client{
org.apache.kafka.common.security.plain.PlainLoginModule required
username="kafka"
password="kafkapwd";
};
A few words about the KafkaServer section first. Entries of the form user_<name>="<password>" define the accounts that client programs (producers and consumers) authenticate with; you can define as many as you like, and later you can also attach ACLs to individual users, which is beyond the scope of this article. This is my current understanding of how the configuration works. In the example above a single user, zsh with password zshpwd, is defined via user_zsh="zshpwd". The username and password fields select the account used for communication between the brokers themselves; here that is also zsh/zshpwd.
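For illustration only (the admin, producer, and consumer accounts and their passwords below are made up, not part of the setup used in this article), a KafkaServer section with several users and a dedicated inter-broker account could look like this:
KafkaServer {
org.apache.kafka.common.security.plain.PlainLoginModule required
username="admin"
password="admin-pwd"
user_admin="admin-pwd"
user_producer="producer-pwd"
user_consumer="consumer-pwd";
};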
The Client section is much easier to understand: it is what the broker uses to connect to Zookeeper. Pick one of the users defined in the Zookeeper JAAS file above and fill in its username and password (here kafka/kafkapwd).
2. Add or modify the following settings in Kafka's server.properties
listeners=SASL_PLAINTEXT://192.168.2.xxx:19092
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=PLAIN
allow.everyone.if.no.acl.found=true
authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
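Optionally, and as an assumption on top of the original setup rather than something the rest of this guide depends on, the inter-broker user can be declared a super user so that broker-to-broker traffic is never blocked by ACL rules once SimpleAclAuthorizer is active:
super.users=User:zsh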
3. Make the Kafka startup script read the kafka_server_jaas.conf file created in step 1
Modify Kafka's kafka-server-start.sh:
find the line
export KAFKA_HEAP_OPTS="-Xmx1G -Xms1G"
and change it to
export KAFKA_HEAP_OPTS="-Xmx1G -Xms1G -Djava.security.auth.login.config=/opt/kafka_2.11-1.1.1/config/kafka_server_jaas.conf"
4. Start the Kafka service and check the log for errors
kafka-server-start.sh -daemon /opt/kafka_2.11-1.1.1/config/server.properties
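To verify the authentication setup from the command line before writing any Java code, the console producer can be pointed at a small client properties file; the file name client_sasl.properties and the topic name test below are assumptions for this sketch:
# client_sasl.properties
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="zsh" password="zshpwd";

# send a test message through the authenticated listener
kafka-console-producer.sh --broker-list 192.168.2.xxx:19092 --topic test --producer.config client_sasl.properties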
III. Java Client Authentication
Only the authentication-related configuration is shown here by way of example; all other producing and consuming code stays the same.
Client JAAS file, kafka_client_jaas.conf:
KafkaClient{
org.apache.kafka.common.security.plain.PlainLoginModule required
username="zsh"
password="zshpwd";
};
Producer:
public KafkaProducer() {
    // point the JVM at the client JAAS file created above
    System.setProperty("java.security.auth.login.config", "C://Kafkaanquan//kafka_client_jaas.conf");
    Properties props = new Properties();
    props.put("bootstrap.servers", KafkaParameter.KAFKA_HOST_IP.getValue());
    props.put("acks", "all");
    props.put("retries", 0);
    props.put("batch.size", 16384);
    props.put("linger.ms", 1);
    props.put("buffer.memory", 33554432);
    props.put("max.request.size", 8000000);
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    // SASL/PLAIN over an unencrypted connection
    props.put("security.protocol", "SASL_PLAINTEXT");
    props.put("sasl.mechanism", "PLAIN");
    this.producer = new org.apache.kafka.clients.producer.KafkaProducer<>(props);
}
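A minimal usage sketch for the producer above; the topic name test, the key/value, and the callback handling are illustrative rather than part of the original code (ProducerRecord comes from org.apache.kafka.clients.producer):
// send one record; the callback surfaces authentication or connection problems
producer.send(new ProducerRecord<String, String>("test", "key-1", "hello with SASL"),
        (metadata, exception) -> {
            if (exception != null) {
                exception.printStackTrace();
            } else {
                System.out.println("sent to partition " + metadata.partition() + ", offset " + metadata.offset());
            }
        });
producer.flush(); // make sure the record leaves the client before the program exits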
Consumer:
public KafkaConsumer(String topic, int count) {
    Properties props = new Properties();
    props.put("bootstrap.servers", KafkaParameter.KAFKA_HOST_IP.getValue());
    // group.id identifies the consumer group
    props.put("group.id", "20190305");
    // the zookeeper.* settings below are leftovers from the old consumer API and are ignored by this client
    props.put("zookeeper.session.timeout.ms", "600000");
    props.put("zookeeper.sync.time.ms", "200000");
    props.put("auto.commit.interval.ms", "100000");
    props.put(org.apache.kafka.clients.consumer.ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    // cap the number of records returned by a single poll()
    props.put(org.apache.kafka.clients.consumer.ConsumerConfig.MAX_POLL_RECORDS_CONFIG, count + "");
    // SASL/PLAIN over an unencrypted connection
    props.put("security.protocol", "SASL_PLAINTEXT");
    props.put("sasl.mechanism", "PLAIN");
    props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"zsh\" password=\"zshpwd\";");
    // Alternative: read the JAAS file instead of setting sasl.jaas.config (use one or the other)
    // System.setProperty("java.security.auth.login.config", "C:/Kafkaanquan/kafka_client_jaas.conf");
    consumer = new org.apache.kafka.clients.consumer.KafkaConsumer(props);
    consumer.subscribe(Arrays.asList(topic));
}
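A minimal usage sketch for the consumer above; the poll timeout and the printing are arbitrary, and ConsumerRecords/ConsumerRecord come from org.apache.kafka.clients.consumer:
// fetch one batch of records (poll(long) is the API available in this client version)
ConsumerRecords<String, String> records = consumer.poll(1000);
for (ConsumerRecord<String, String> record : records) {
    System.out.printf("offset=%d, key=%s, value=%s%n", record.offset(), record.key(), record.value());
}
consumer.close();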
The key additions for authentication are the following three items:
props.put("security.protocol", "SASL_PLAINTEXT");
props.put("sasl.mechanism", "PLAIN");
System.setProperty("java.security.auth.login.config", "C:/Kafkaanquan/kafka_server_jaas.conf");
Exception reference
1. The Kafka client JAAS configuration was not loaded
Exception in thread "main" org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:793)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:644)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:624)
at com.xxx.kafka.KafkaConsumer.<init>(KafkaConsumer.java:32)
at com.xxx.kafka.KafkaConsumer.getInstance(KafkaConsumer.java:38)
at com.xxx.kafka.KafkaConsumer.main(KafkaConsumer.java:44)
Caused by: java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in the JAAS configuration. System property 'java.security.auth.login.config' is not set
at org.apache.kafka.common.security.JaasContext.defaultContext(JaasContext.java:133)
at org.apache.kafka.common.security.JaasContext.load(JaasContext.java:98)
at org.apache.kafka.common.security.JaasContext.loadClientContext(JaasContext.java:84)
at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:119)
at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:65)
at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:88)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:710)
... 5 more
2. Incorrect username or password in the JAAS configuration
If the credentials do not match an account defined in kafka_server_jaas.conf on the broker side, the client cannot authenticate to the broker; double-check the username and password in kafka_client_jaas.conf or in the sasl.jaas.config property.