The previous post showed how to ship logs from Filebeat straight into Elasticsearch. In most cases, though, you want each log line split into separate fields before it is written to Elasticsearch, which makes querying and aggregation much easier, and that is where Logstash comes in.
Installing Logstash (CentOS)

This section installs Logstash via yum:
- Import the Elastic GPG key:

```
sudo rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
```
- Create a file named `elastic.repo` under `/etc/yum.repos.d/` with the following content:

```
[logstash-7.x]
name=Elastic repository for 7.x packages
baseurl=https://artifacts.elastic.co/packages/7.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
```
- Install the package:

```
sudo yum install logstash
```
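A quick sanity check after the install finishes (the paths below assume the default RPM layout; adjust if yours differs):

```shell
# Print the installed Logstash version to confirm the package is usable.
/usr/share/logstash/bin/logstash --version

# Optionally have Logstash start at boot (systemd-based CentOS).
sudo systemctl enable logstash
```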
Configuring Logstash
- Create a pipeline file `xxx.conf` under `/etc/logstash/conf.d`. An online Grok debugger is recommended for validating grok rules before deploying them.
- Put the following into `xxx.conf`:

```
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:logtime} \[%{NOTSPACE:threadname}\] %{LOGLEVEL:loglevel} %{DATA:javamethod} - %{JAVALOGMESSAGE:logcontent}" }
  }
  date {
    match => ["logtime", "ISO8601"]
    target => "@timestamp"
  }
  mutate {
    add_field => { "[@metadata][target_index]" => "fb-%{filetype}-%{+YYYY.MM.dd}" }
    remove_field => ["logtime", "message", "tags"]
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "%{[@metadata][target_index]}"
    user => "elastic"
    password => "search"
  }
}
```
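Before (re)starting the service, Logstash can check a pipeline file for syntax errors with `--config.test_and_exit`. A quick validation run, assuming the default RPM install paths:

```shell
# Parse the pipeline file and exit; prints "Configuration OK" on success.
sudo /usr/share/logstash/bin/logstash \
  -f /etc/logstash/conf.d/xxx.conf --config.test_and_exit
```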
The grok expression above parses `message` fields produced by the following log layout:

```
%d [%t] %-5level %logger{36}.%M\(%file:%line\) - %msg%n
```
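To see what the grok expression pulls out of such a line, here is a rough shell sketch that extracts two of the fields with `sed` regexes standing in for the grok patterns. The sample line and its values are made up for illustration; the real grok definitions are richer than these approximations:

```shell
# A made-up sample line in the layout above.
line='2021-06-01 12:00:00,123 [main] INFO  com.example.App.run(App.java:42) - application started'

# Rough stand-in for TIMESTAMP_ISO8601 + NOTSPACE + LOGLEVEL: capture the level.
loglevel=$(echo "$line" | sed -E 's/^[0-9-]+ [0-9:,]+ \[[^]]+\] (TRACE|DEBUG|INFO|WARN|ERROR|FATAL).*/\1/')

# Rough stand-in for JAVALOGMESSAGE: everything after the " - " separator.
logcontent=$(echo "$line" | sed -E 's/.* - //')

echo "$loglevel"
echo "$logcontent"
```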
- Restart the Logstash service:

```
service logstash restart
```
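Once Filebeat is shipping events again, you can spot-check that parsed documents are landing in the daily index. The host and credentials below match the sample `elasticsearch` output above:

```shell
# List the fb-* indices the pipeline writes to.
curl -s -u elastic:search 'http://127.0.0.1:9200/_cat/indices/fb-*?v'

# Peek at one parsed document to confirm the fields were split as expected.
curl -s -u elastic:search 'http://127.0.0.1:9200/fb-*/_search?size=1&pretty'
```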