For download and installation, refer to the official site; this post only records how to configure Logstash to feed Kafka.
Requirement: push every log line containing "api: " into Kafka. Sample log line:
2017-07-13 18:01:31.619 INFO 15096 --- [nio-9001-exec-1] c.z.a.t.a.s.impl.XXXServiceImpl : EventReportService_hasLog_api: {"uid":123,"statisticalTime":"2017-07-13 18:01:01 +0800"}
1. Define the grok pattern
Create a patterns directory under the Logstash installation directory and add a pattern file:
mkdir patterns && cd patterns
vim API_TK
A grok pattern file holds one definition per line in the form NAME regex; API_T here is the name the filter below references, and the regex lazily matches everything up to the "api" marker:
API_T [\s\S]*?api
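To sanity-check what the pattern will extract before wiring up Logstash, the split can be approximated with sed. This is only an approximation (grok uses Oniguruma regexes, sed uses POSIX ones), but for a single "api: " marker the result is the same:

```shell
# The sample log line from above, with plain ASCII quotes
line='2017-07-13 18:01:31.619 INFO 15096 --- [nio-9001-exec-1] c.z.a.t.a.s.impl.XXXServiceImpl : EventReportService_hasLog_api: {"uid":123,"statisticalTime":"2017-07-13 18:01:01 +0800"}'

# Strip everything through "api: "; the remainder is what grok would
# store in the overwritten "message" field and send to Kafka.
payload=$(printf '%s\n' "$line" | sed 's/.*api: //')
echo "$payload"
```

If the printed text is the bare JSON payload, the pattern idea is sound and the grok filter should populate the message field accordingly.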
2. Edit the Logstash config file syj.conf
input {
  file {
    path => "/home/syj/log/dd/sdf.txt"
    discover_interval => 15
    start_position => "beginning"
  }
}

filter {
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{API_T:api}: %{GREEDYDATA:message}" }
    overwrite => [ "message" ]
  }
  if "_grokparsefailure" in [tags] { drop {} }
}

output {
  if "_grokparsefailure" not in [tags] {
    kafka {
      codec => plain {
        format => "%{message}"
      }
      topic_id => "log"
      bootstrap_servers => "127.16.10.21:9093,127.16.10.21:9094,127.16.10.21:9092"
      acks => "0"
      compression_type => "snappy"
    }
  }
}
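Before starting the pipeline it is worth checking the file for syntax errors. A sketch, assuming Logstash 5.x, which supports a parse-only mode (older releases use the short flag -t instead):

```shell
# Parse-check conf/syj.conf and exit without starting the pipeline
bin/logstash -f conf/syj.conf --config.test_and_exit
```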
3. Run and test
bin/logstash -f conf/syj.conf &
echo '2017-07-13 18:01:31.619 INFO 15096 --- [nio-9001-exec-1] c.z.a.t.a.s.impl.XXXServiceImpl : EventReportService_hasLog_api: {"uid":123,"statisticalTime":"2017-07-13 18:01:01 +0800"}' >> /home/syj/log/dd/sdf.txt
I wrote a small Kafka consumer client, and it received the message, so the pipeline works.
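Instead of writing a custom client, the console consumer bundled with the Kafka distribution can also confirm delivery. A sketch, assuming Kafka 0.10+ (where the consumer accepts --bootstrap-server) and using one of the broker addresses from the output section:

```shell
# Read the "log" topic from the beginning and print each record to stdout
bin/kafka-console-consumer.sh \
  --bootstrap-server 127.16.10.21:9092 \
  --topic log \
  --from-beginning
```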
For further Logstash tuning options, search Google/Baidu.