It turned out to be a ZooKeeper problem: after restarting ZooKeeper and re-consuming the data from Kafka, the Logstash logs went back to normal.
The ZooKeeper setup is as follows:
docker run --name myzk -p 2181:2181 -d jplock/zookeeper
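Before pointing Kafka at it, a quick sanity check on ZooKeeper can be done with the ruok four-letter command (a minimal sketch; it assumes nc is available on the host, and newer ZooKeeper versions require ruok to be whitelisted via 4lw.commands.whitelist):

# Should answer "imok" if ZooKeeper is up and serving
echo ruok | nc 127.0.0.1 2181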
The Kafka setup is as follows:
docker run -d --name mykafka --publish 9092:9092 --link myzk \
  --env KAFKA_ZOOKEEPER_CONNECT=myzk:2181 \
  --env KAFKA_ADVERTISED_HOST_NAME=172.17.0.10 \
  --env KAFKA_ADVERTISED_PORT=9092 \
  --env KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://xx.xx.xx.xx:9092 \
  --volume /etc/localtime:/etc/localtime \
  wurstmeister/kafka:latest
Note: KAFKA_ADVERTISED_HOST_NAME is set to the cloud server's internal (private) IP, while KAFKA_ADVERTISED_LISTENERS uses the external (public) IP.
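To check that the advertised listener is really what external clients see, the broker metadata can be queried from outside the server (a sketch assuming kafkacat/kcat is installed on the client machine; xx.xx.xx.xx stands for the public IP):

# Prints brokers, topics and partitions as advertised by the broker;
# the broker address shown in the output must be reachable from this client
kafkacat -b xx.xx.xx.xx:9092 -L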
The Logstash configuration is as follows:
input {
  kafka {
    client_id => 'logstash'
    group_id => 'logstash'
    codec => 'json'
    auto_offset_reset => 'earliest'
    bootstrap_servers => 'xx.xx.xx.xx:9092'
    topics_pattern => 'TopicMessageLog'
    max_partition_fetch_bytes => '5242880'
    metadata_max_age_ms => 1000
  }
}
filter {
  # Drop events whose JSON payload could not be parsed
  if "_jsonparsefailure" in [tags] {
    drop {}
  }
}
output {
  elasticsearch {
    hosts => ["http://xx.xx.xx.xx:9201/"]
    index => "xx-msg-%{+YYYY-MM}"
  }
  stdout {
    codec => rubydebug
  }
}
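For an end-to-end check of this pipeline, a test JSON message can be produced to the topic and then searched for in Elasticsearch (a sketch, again assuming kafkacat on the client and the xx placeholders filled in; "pipeline-test" is just a hypothetical marker value):

# Produce one JSON message matching the json codec configured above
echo '{"msg":"pipeline-test"}' | kafkacat -b xx.xx.xx.xx:9092 -t TopicMessageLog -P

# After Logstash flushes, the document should show up in the monthly index
curl 'http://xx.xx.xx.xx:9201/xx-msg-*/_search?q=msg:pipeline-test&pretty'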