1. Integrating Logstash with Kerberos-enabled Kafka
input {
  beats {
    port => 5044
  }
}
output {
  kafka {
    bootstrap_servers => "datanode1.cloudera.yijiupidev.com:9092"
    security_protocol => "SASL_PLAINTEXT"
    sasl_kerberos_service_name => "kafka"
    topic_id => "flink_bi_report"
    jaas_path => "/etc/logstash/conf.d/jaas.conf" # may differ per environment
    kerberos_config => "/etc/krb5.conf" # may differ per environment
  }
}
Contents of the file referenced by jaas_path:
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/home/keytab/kafka.keytab" # may differ per environment
  principal="XXXX"; # may differ per environment
};
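The JAAS file above can be generated and sanity-checked with a small shell sketch before starting Logstash. This is only an illustration: it writes to /tmp rather than /etc/logstash/conf.d, and the keytab path and "XXXX" principal are the placeholders from the config above.

```shell
# Sketch: write the JAAS file and sanity-check the keytab before starting
# Logstash. Paths mirror the config above and may differ per environment.
JAAS=/tmp/jaas.conf                 # production: /etc/logstash/conf.d/jaas.conf
KEYTAB=/home/keytab/kafka.keytab

cat > "$JAAS" <<'EOF'
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/home/keytab/kafka.keytab"
  principal="XXXX";
};
EOF

# Krb5LoginModule fails at runtime if the keytab is unreadable by the
# logstash user, so check it up front.
[ -r "$KEYTAB" ] || echo "WARN: keytab $KEYTAB is missing or unreadable"
echo "wrote $JAAS"
```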
2. Integrating Logstash with Kerberos-enabled HDFS
input {
  #beats {
  #  port => 5044
  #}
  file {
    id => "kerberos_hdfs_test01"
    path => "/test01.txt"
  }
}
output {
  stdout {}
  webhdfs {
    host => "XXX" # (required) must be the namenode
    standby_host => "XXX" # standby namenode
    port => 14000 # (optional, default: 50070)
    standby_port => 14000
    path => "XXX" # (required)
    user => "hdfs" # (required)
    codec => line {
      format => "%{message}"
    }
    use_kerberos_auth => "true"
    kerberos_keytab => "/home/keytab/hdfs.keytab" # may differ per environment
  }
}
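Before wiring the endpoint into Logstash, it can help to smoke-test it by hand. The sketch below is an assumption about a typical setup, not part of the original config: "XXX" is the placeholder host from above, and port 14000 is usually an HttpFS gateway (the NameNode's own webhdfs port defaults to 50070).

```shell
# Sketch of a manual smoke test against the webhdfs/HttpFS REST API.
NAMENODE_HOST="XXX"   # replace with your active namenode / HttpFS host
PORT=14000
URL="http://${NAMENODE_HOST}:${PORT}/webhdfs/v1/?op=LISTSTATUS"

if [ "$NAMENODE_HOST" != "XXX" ]; then
  # --negotiate needs a valid Kerberos ticket: run kinit first (see the steps below)
  curl --negotiate -u : "$URL"
else
  echo "set NAMENODE_HOST first; would call: $URL"
fi
```

A successful LISTSTATUS response confirms both network reachability and Kerberos negotiation, which separates cluster-side problems from Logstash-side ones.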
Steps
- Install the gssapi gem.
  Error: java.lang.IllegalStateException: Logstash stopped processing because of an error: (LoadError) no such file to load -- gssapi
  Fix: /usr/share/logstash/bin/logstash-plugin install --no-verify gssapi
- Patch lib/gssapi/simple.rb and lib/gssapi/exceptions.rb.
  Error: read_uint32' for #<FFI::MemoryPointer address=0x7fee6404c7a0 size=4> Did you mean? read_uint read_int read_array_of_uint32 read_array_of_int32 read_pointer read_ulong read_string read_ushort read_array_of_uint64 read_array_of_uint16 get_uint32) [ERROR] 2019-06-13 16:21:38.110 [[main]-pipeline-manager] pipeline - Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::OutputDelegator:0x725df834>"
  Fix: locate both files (they sit in the same directory) with
  find /usr/share/logstash/ -name "*exceptions*"
  then apply the changes from:
  https://github.com/gpaggi/gssapi/commit/69b14cce1966fdae713d274ca19339237eca72b9
- Grant read permissions on hdfs.keytab:
  chmod 777 hdfs.keytab
- Initialize hdfs.keytab.
  Error: pipeline - Pipeline aborted due to e_S_COMPLETE: Unspecified GSS failure. Minor code may provide more information
  No Kerberos credentials available (default cache: FILE:/tmp/krb5cc_0)
  Fix: kinit -kt /home/keytab/hdfs.keytab hdfs
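The keytab check and kinit steps above can be folded into a small pre-start sketch. The keytab path and principal mirror the examples in this note and may differ per environment; the Logstash launch line at the end is illustrative.

```shell
# Sketch: refresh the Kerberos ticket before (re)starting Logstash, so the
# "No Kerberos credentials available" error above does not recur once the
# ticket cache expires. Adjust paths and principal per environment.
KEYTAB=/home/keytab/hdfs.keytab
PRINCIPAL=hdfs

if [ -r "$KEYTAB" ]; then
  kinit -kt "$KEYTAB" "$PRINCIPAL" && echo "ticket refreshed for $PRINCIPAL"
  klist   # show the resulting ticket cache
  # then start Logstash, e.g.:
  # /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/
else
  echo "keytab $KEYTAB missing or unreadable; run the chmod step above first"
fi
```

Since tickets expire, a common approach is to re-run the kinit line periodically (e.g. from cron) rather than only at startup.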