1. Download Filebeat
2. Extract the archive and run the following commands to install Filebeat as a Windows service
cd D:\ELK\filebeat-7.15.0
PowerShell.exe -ExecutionPolicy UnRestricted -File .\install-service-filebeat.ps1
Status   Name      DisplayName
------   ----      -----------
Stopped  filebeat  filebeat
3. Configure Kibana and Elasticsearch in filebeat.yml
# =================================== Kibana ===================================
# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  # Use the IP address if the host has one, or the domain name if it uses one
  host: "kibana-host:5601"
# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["es-node-1-host:9200", "es-node-2-host:9200"]

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  username: "elastic"
  password: "123456"
4. Collect logs
Collect the Elasticsearch logs:
# List the available modules
filebeat.exe modules list
# Enable the elasticsearch module
filebeat.exe modules enable elasticsearch
5. Configure the Elasticsearch log paths
Location: filebeat-7.15.0\modules.d\elasticsearch.yml
# Module: elasticsearch
# Docs: https://www.elastic.co/guide/en/beats/filebeat/7.x/filebeat-module-elasticsearch.html
- module: elasticsearch
  # Server log
  server:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["D:\\ELK\\elasticsearch-7.15.0\\logs\\moss-es.log"]

  gc:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["D:\\ELK\\elasticsearch-7.15.0\\logs\\gc.log*"]

  audit:
    enabled: false

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    #var.paths:

  slowlog:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["D:\\ELK\\elasticsearch-7.15.0\\logs\\moss-es_index_indexing_slowlog.log"]

  deprecation:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["D:\\ELK\\elasticsearch-7.15.0\\logs\\moss-es_deprecation.log"]
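The `var.paths` values are glob patterns, which is why `gc.log*` also picks up rotated GC log files. A minimal Python sketch of this kind of matching (the file names below are invented for illustration):

```python
# Illustrates glob-style matching as used by var.paths.
# The file names below are hypothetical examples, not from a real install.
from fnmatch import fnmatch

files = ["gc.log", "gc.log.0.current", "gc.log.01", "moss-es.log"]
matched = [f for f in files if fnmatch(f, "gc.log*")]
# "gc.log*" matches the three GC log files but not moss-es.log
```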
6. Start Filebeat
Default Filebeat log location: C:\ProgramData\filebeat\logs
# Start the service
net start filebeat
# Stop the service
net stop filebeat
7. Log in to Kibana
Click [Discover] and select the filebeat-* index pattern; the collected logs will be displayed.
Collecting application logs with Filebeat
Official documentation: https://www.elastic.co/guide/en/beats/filebeat/master/filebeat-input-log.html
1. Configure the application log input
Modify filebeat.inputs in filebeat.yml; multiple inputs can be configured.
# ============================== Filebeat inputs ===============================
filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - E:\data\logs\moss-oauth\*.log
    #- c:\programdata\elasticsearch\logs\*

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  #exclude_lines: ['^DBG']

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  #include_lines: ['^ERR', '^WARN']

  # Exclude files. A list of regular expressions to match. Filebeat drops the files that
  # are matching any regular expression from the list. By default, no files are dropped.
  #exclude_files: ['.gz$']

  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering
  fields:
    index: 'moss-oauth'
    # level: debug
    # review: 1

  ### Multiline options

  # Multiline can be used for log messages spanning multiple lines. This is common
  # for Java Stack Traces or C-Line Continuation

  # The regexp Pattern that has to be matched. The example pattern matches all lines starting with [
  #multiline.pattern: ^\[

  # Defines if the pattern set under pattern should be negated or not. Default is false.
  #multiline.negate: false

  # Match can be set to "after" or "before". It is used to define if lines should be append to a pattern
  # that was (not) matched before or after or as long as a pattern is not matched based on negate.
  # Note: After is the equivalent to previous and before is the equivalent to next in Logstash
  #multiline.match: after

  # Lines that do not start with a timestamp such as "2019-09-08 12:23:23" are merged into the previous line
  multiline.pattern: '^\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}'
  # Whether to negate the pattern above; default is false
  multiline.negate: true
  # Whether continuation lines are appended after or before the matching line; can be "after" or "before"
  multiline.match: after

# ======================= Elasticsearch template setting =======================
# To customize the index name, index lifecycle management (ILM) must be disabled
setup.ilm.enabled: false
setup.template.name: "moss-oauth"
setup.template.pattern: "moss-oauth-*"

# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["es-node-1-host:9200", "es-node-2-host:9200"]

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  username: "elastic"
  password: "123456"

  # Different logs can be routed to different indices here
  indices:
    - index: "moss-oauth-%{+yyyy.MM.dd}"
      when.contains:
        fields:
          index: "moss-oauth"
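With `multiline.negate: true` and `multiline.match: after`, any line that does not match `multiline.pattern` is appended to the event started by the most recent timestamp line, which is how a Java stack trace stays attached to its log entry. A minimal Python sketch of this grouping logic (the log lines are invented for illustration):

```python
# Sketch of multiline grouping with negate=true, match=after:
# a line matching the timestamp pattern starts a new event; any other
# line is appended to the previous event. Sample lines are hypothetical.
import re

pattern = re.compile(r'^\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}')

lines = [
    "2019-09-08 12:23:23 ERROR request failed",
    "java.lang.NullPointerException: boom",
    "    at com.example.Foo.bar(Foo.java:42)",
    "2019-09-08 12:23:24 INFO recovered",
]

events = []
for line in lines:
    if pattern.match(line):      # timestamp line -> start a new event
        events.append([line])
    elif events:                 # continuation line -> previous event
        events[-1].append(line)

# The four input lines collapse into two events; the stack trace
# belongs to the first one.
```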
2. Start Filebeat
# Start the service
net start filebeat
# Stop the service
net stop filebeat
3. Check the index
4. Open Kibana and click [Discover] to view the logs
If the logs are not visible, create an [index pattern] yourself, as shown in the figure below.