1. Background
By collecting application logs into Elasticsearch (ES), we can analyze them with Kibana visualizations. The log format looks like this (the logMsg field, 接口响应时间, is Chinese for "interface response time"):
[2020-12-10 11:27:01] [-] [info] [application] [-] [-] [api_run_time] [接口响应时间] [{"api":"/index/test","took_s":"0.01","memory_used":"8.05"}]
Approach
- Filebeat collects the logs in real time and ships them to the Logstash server.
- Logstash filters the received events and extracts the fields of interest.
- Logstash sends the extracted data to ES.
- Configure a Kibana index pattern to view the ES documents.
- Build Kibana visualizations to analyze the logs.
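Before wiring up the pipeline, it helps to see what field extraction has to produce. A minimal Python sketch of the parsing step (a stand-in for the grok pattern used later, assuming the bracket-delimited format shown above):

```python
import json
import re

# Regex mirroring the grok pattern used later: nine bracket-delimited fields.
LOG_RE = re.compile(
    r"\[(?P<log_time>[^\]]+)\] \[(?P<method>[^\]]*)\] \[(?P<level>[^\]]*)\] "
    r"\[(?P<category>[^\]]*)\] \[(?P<url>[^\]]*)\] \[(?P<params>[^\]]*)\] "
    r"\[(?P<logKey>[^\]]*)\] \[(?P<logMsg>[^\]]*)\] \[(?P<logData>.*)\]"
)

def parse_line(line: str) -> dict:
    """Split one log line into named fields; logData stays a JSON string."""
    m = LOG_RE.match(line)
    return m.groupdict() if m else {}

sample = ('[2020-12-10 11:27:01] [-] [info] [application] [-] [-] '
          '[api_run_time] [接口响应时间] '
          '[{"api":"/index/test","took_s":"0.01","memory_used":"8.05"}]')
fields = parse_line(sample)
data = json.loads(fields["logData"])
```

The last field, logData, is still a JSON-encoded string after this step; parsing it is exactly the problem section 5 deals with.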
2. Add a Filebeat Input on the Log Server
- Edit the Filebeat configuration file:
vim /usr/local/filebeat/filebeat.yml
- Add a new rule alongside the existing - type: log entries:
- type: log
  enabled: true
  paths:
    - /home/logs/log_*.log    # log path; glob patterns are supported
  tags: ["time_log"]
  fields_under_root: true
  fields:
    serverip: ${SERVERIP}     # extra field: IP of the source server
  multiline:                  # merge multi-line log entries into one event
    pattern: '^(\[\d{4})\-(\d{2})\-(\d{2})'   # regex marking the start of an event
    negate: true              # lines NOT starting with [YYYY-MM-DD are continuation lines
    match: after              # append them to the preceding line
    max_lines: 1000
    timeout: 30s
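The multiline settings merge continuation lines (stack traces, wrapped messages) into the preceding event. A quick Python check of the date-prefix pattern, showing which lines start a new event:

```python
import re

# Same regex as in the Filebeat config: a line starting with "[YYYY-MM-DD".
pattern = re.compile(r"^(\[\d{4})\-(\d{2})\-(\d{2})")

lines = [
    "[2020-12-10 11:27:01] [-] [info] ...",   # starts a new event
    "Traceback (most recent call last):",     # continuation line
    '  File "app.py", line 10, in <module>',  # continuation line
]
starts_event = [bool(pattern.match(line)) for line in lines]
```

With negate: true and match: after, every line for which the pattern does not match is appended to the most recent matching line.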
- Restart Filebeat:
./filebeat -e -c filebeat.yml
3. Configure Logstash
- Edit the configuration file:
vim /usr/local/logstash/config/elk_full_patten.conf
- Add a matching rule inside filter{}:
if "time_log" in [tags] {
    grok {
        match => {
            "message" => ["\[(?<[applog][log_time]>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND})\] \[%{QS:[applog][method]}\] \[%{QS:[applog][level]}\]\s+\[%{QS:[applog][category]}\] \[%{QS:[applog][url]}\] \[%{QS:[applog][params]}\] \[%{QS:[applog][logKey]}\] \[%{QS:[applog][logMsg]}\] \[%{QS:[applog][logData]}\]"]
        }
        remove_field => ["message"]
    }
    # Drop Beats metadata we do not need.
    mutate {
        remove_field => ["beat","meta","host","agent"]
    }
    # Copy the applog object into a string field, then parse it so its keys
    # are promoted to the event root; remove the copies we do not need.
    mutate {
        add_field => { "json_log" => "%{applog}" }
    }
    json {
        source => "json_log"
        remove_field => ["json_log","client","log_time","method","level","category","url","logKey","logMsg"]
    }
    # Same trick for the logData JSON string.
    mutate {
        add_field => { "api_time" => "%{logData}" }
    }
    json {
        source => "api_time"
        remove_field => ["logData","api_time"]
    }
}
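In plain terms, this filter chain flattens the nested applog object and then parses the logData JSON string so its keys become queryable fields. A rough, simplified Python sketch of the end result (a hypothetical event dict, not the Logstash API):

```python
import json

# Hypothetical event as produced by grok: applog is a nested object whose
# logData value is still a JSON-encoded string.
event = {
    "tags": ["time_log"],
    "applog": {
        "logKey": "api_run_time",
        "logData": '{"api":"/index/test","took_s":"1.62","memory_used_M":"10.34"}',
    },
}

# Equivalent of the mutate/json steps above: parse logData and merge its
# keys into the event root, then drop the intermediate fields.
event.update(json.loads(event["applog"]["logData"]))
del event["applog"]
```

After this, api, took_s, and memory_used_M are top-level fields of the event (in the real config, applog itself is kept; it is dropped here only to keep the sketch short).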
- Add the ES output inside output{}:
if "time_log" in [tags] {
    if [applog][logKey] == "api_run_time" {
        #stdout { codec => rubydebug }    # debug output
        elasticsearch {
            hosts => ["127.0.0.3:9200"]               # ES node
            index => "log_api_time-%{index_date}"     # one index per day
        }
    }
}
- Enable stdout { codec => rubydebug } to inspect Logstash's debug output; a collected event looks like this:
{
"tags" => [
[0] "time_log",
[1] "beats_input_codec_plain_applied"
],
"ecs" => {
"version" => "1.5.0"
},
"applog" => {
"method" => "-",
"logData" => "{\"api\":\"/index/test\",\"took_s\":\"1.62\",\"memory_used_M\":\"10.34\"}",
"logKey" => "api_run_time",
"client" => "-",
"category" => "application",
"log_time" => "2020-12-08 17:56:01",
"level" => "info",
"logMsg" => "接口响应时间",
"url" => "-"
},
"serverip" => "192.168.1.1",
"input" => {
"type" => "log"
},
"log" => {
"file" => {
"path" => "/home/logs/log_20201208.log"
},
"offset" => 2501110
},
"index_date" => "2020.12.08",
"@version" => "1",
"@timestamp" => 2020-12-08T09:57:04.636Z
}
4. Build Kibana Visualizations
- Go to the Management page, click Index Patterns, and create an index pattern matching the name configured in Logstash, e.g. log_api_time-*.
- Once created, browse the documents on the Discover page; in the side panel you can choose which fields to display.
- Go to the Visualize page and create a new visualization. In the Buckets panel choose a Date Histogram aggregation on the @timestamp field to get a time-based X axis.
- Add a Y-axis metric: choose Max as the aggregation and took_s as the field. The Y axis then shows the maximum of that field per time bucket. Multiple Y-axis metrics can be added, e.g. one for Max and one for Average.
- Under Metrics & Axes, set the chart type to Area, click Update, and a nice area chart appears. Remember to click Save in the top-left corner when done.
- Go to the Dashboard page, create a new dashboard, click Add in the top left, select the visualizations you just saved, arrange them in order, and the dashboard is done.
5. Common Problems
- When Logstash parses a log line, the timing field we need is inside the logData JSON string nested under applog; visualizations cannot reach into that string, so the logData content has to be parsed out.
Solution:
1) Use mutate to add a json_log field holding the applog object. Parsing it promotes the applog keys to the event root, which also promotes keys we do not need, so the useless ones must be removed afterwards.
2) Use the json filter to parse json_log, then apply the same trick to extract the fields inside logData:
mutate {
    add_field => { "json_log" => "%{applog}" }
}
json {
    source => "json_log"
    remove_field => ["json_log","channel","token","client","log_time","method","level","category","url","params","logKey","logMsg","result","ip","userId","ua","productId"]
}
mutate {
    add_field => { "api_time" => "%{logData}" }
}
json {
    source => "api_time"
    remove_field => ["logData","api_time"]
}
- After extracting took_s, only @timestamp and offset are offered when adding a Y-axis metric; the took_s field does not appear.
Solution:
1) The JSON field took_s is a string, and Kibana only aggregates numeric fields; convert took_s to a number.
2) Alternatively, write the value as a float when the log is produced in the first place.
mutate {
    convert => ["took_s", "float"]
    convert => ["memory_used_M", "float"]
}
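Type conversion matters because Kibana's Max/Average aggregations only work on numeric fields, and comparing numbers as strings gives wrong answers. A small Python illustration:

```python
# took_s values as they arrive from the JSON log: strings, not numbers.
as_strings = ["9.5", "11.62", "101.3"]

# String comparison is lexicographic, so "9.5" beats "101.3".
assert max(as_strings) == "9.5"

# After the convert step the values are numeric and aggregate correctly.
as_floats = [float(v) for v in as_strings]
assert max(as_floats) == 101.3
```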
- Combining 1) and 2), the logData extraction can be simplified: record the numeric fields as floats at logging time, and use a single json filter with a target to put the logData content into a new field, skipping several steps. In the visualization, pick the api_time.took_s field directly:
json {
    source => "[applog][logData]"
    target => "api_time"
}
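With target set, the parsed keys are placed under a new api_time object instead of being scattered over the event root. A Python sketch of the resulting structure (hypothetical event dict):

```python
import json

# Event after grok: logData under applog is still a JSON string,
# with the numeric fields already written as floats at logging time.
event = {"applog": {"logData":
         '{"api":"/index/test","took_s":1.62,"memory_used_M":10.34}'}}

# Equivalent of json { source => "[applog][logData]" target => "api_time" }:
# parse once and attach the result as a nested object.
event["api_time"] = json.loads(event["applog"]["logData"])
```

In Kibana you can then aggregate on api_time.took_s directly.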
6. Extra: screen Commands
- Start a new screen session: screen
- List all sessions: screen -ls
- Attach to a session: screen -r <session-name>
- Force-attach to a session: screen -D -r <session-name>
- Detach from the current session: Ctrl+A, then D
References:
https://github.com/logstash-plugins/logstash-patterns-core/blob/master/patterns/grok-patterns
https://www.cntofu.com/book/52/filter/mutate.md
https://www.jianshu.com/p/de06284e1484