Reference: https://blog.51cto.com/zero01/2079879
Versions 6.0 and 6.8 differ slightly; this article mainly covers the differences.
Host: 192.168.5.166, with ELK + Filebeat all installed on a single machine. Install JDK 1.8 beforehand and disable the firewall.
1. Install Elasticsearch
See the referenced article for the installation steps.
Modify the configuration (only the differences are listed):
[root@master-node ~]# vi /etc/elasticsearch/elasticsearch.yml  # add or change the following
node.data: true  # this node stores data (it is a data node)
discovery.zen.ping.unicast.hosts: ["192.168.5.166"]  # configure node discovery
Note: Elasticsearch's own log directory must be owned by the elasticsearch user (not root), otherwise Elasticsearch cannot write its logs.
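If the ownership is wrong (for example after copying files around as root), a fix along these lines usually works; /var/log/elasticsearch is the default path for the RPM install, so adjust it if yours differs:
[root@master-node ~]# chown -R elasticsearch:elasticsearch /var/log/elasticsearch
[root@master-node ~]# systemctl restart elasticsearch
[root@master-node ~]# curl http://192.168.5.166:9200   # should return the cluster/version JSON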
2. Install Kibana
See the referenced article for the installation steps. Kibana UI setup:
management -> index patterns -> create index pattern -> log-info-*
management -> index patterns -> create index pattern -> log-warn-*
management -> index patterns -> create index pattern -> log-system-*
logs -> configure source (Name: anything; Log indices: filebeat-*, kibana_sample_data_logs*, log-info-*, log-warn-*, log-system-*)
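For reference, a minimal /etc/kibana/kibana.yml for this single-host setup might look like the sketch below. This is an assumption based on the Kibana 6.8 defaults, not taken from the referenced article (note: elasticsearch.hosts is the 6.6+ key; older 6.x releases used elasticsearch.url):
[root@master-node ~]# vi /etc/kibana/kibana.yml
server.port: 5601
server.host: "192.168.5.166"                         # listen address; nginx proxies to this
elasticsearch.hosts: ["http://192.168.5.166:9200"]   # where Kibana reaches Elasticsearch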
3. Install Logstash (omitted)
4. Install nginx-1.16.0
I ran into dependency problems installing nginx on CentOS 7, and worked around them by adding the official nginx repo:
[root@master-node ~]# vi /etc/yum.repos.d/nginx.repo
[nginx]
name=nginx repo
baseurl=http://nginx.org/packages/centos/7/$basearch/
gpgcheck=0
enabled=1
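With the repo in place, installation is the usual yum invocation; the version check simply confirms which build was pulled in:
[root@master-node ~]# yum install -y nginx
[root@master-node ~]# nginx -v   # expect nginx/1.16.0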
Modify the nginx configuration (only the differences):
[root@master-node ~]# vi /etc/nginx/conf.d/default.conf
# before the server block
log_format main2 '$http_host $remote_addr - $remote_user [$time_local] "$request" '
                 '$status $body_bytes_sent "$http_referer" '
                 '"$http_user_agent" "$upstream_addr" $request_time';
server {
    location / {
        proxy_pass http://192.168.5.166:5601;   # proxy to Kibana
    }
    access_log /tmp/elk_access.log main2;
}
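After editing, validate the config and reload before moving on; if the proxy works, port 80 should now serve the Kibana UI:
[root@master-node ~]# nginx -t                       # check the config syntax
[root@master-node ~]# systemctl restart nginx
[root@master-node ~]# curl -I http://192.168.5.166/  # expect a response from Kibana via the proxy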
5. Install Filebeat
Modify the configuration:
[root@master-node ~]# vi /etc/filebeat/filebeat.yml
#=========================== Filebeat inputs =============================
filebeat.inputs:
# application info logs
- type: log
  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - infopath          # placeholder: replace with the actual info log path
  fields:
    log_type: info
  fields_under_root: true
# application warn logs
- type: log
  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - warnpath          # placeholder: replace with the actual warn log path
    #- c:\programdata\elasticsearch\logs\*
  fields:
    log_type: warn
  fields_under_root: true
# system logs
- type: log
  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - systempath        # placeholder: replace with the actual system log path
  fields:
    log_type: system
  fields_under_root: true
  # join multi-line records into one event: a line that does NOT match the
  # start-of-record pattern (negate: true) is appended after the previous line
  multiline.pattern: '((=(ERROR|WARNING)+\sREPORT====\s\d{1,2}-)|(\{\"\d{4}-\d{1,2}-\d{1,2}\s\d{1,2}:\d{1,2}:\d{1,2}))+'
  multiline.negate: true
  multiline.match: after
#==================== Elasticsearch template setting ==========================
setup.template.name: "filebeat-6.8"
setup.template.pattern: "filebeat-6.8-*"
setup.dashboards.index: "filebeat-6.8-*"
#============================== Kibana =====================================
setup.kibana:
  host: "192.168.5.166:5601"
#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  hosts: ["192.168.5.166:9200"]
  indices:
    - index: "log-info-%{+yyyy.MM.dd}"
      when.contains:
        log_type: info
    - index: "log-warn-%{+yyyy.MM.dd}"
      when.contains:
        log_type: warn
    - index: "log-system-%{+yyyy.MM.dd}"
      when.contains:
        log_type: system
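Before daemonizing, Filebeat's built-in checks catch most mistakes in the YAML above, and a quick _cat/indices call confirms events are landing in the indices defined by the config:
[root@master-node ~]# filebeat test config -c /etc/filebeat/filebeat.yml   # validate the YAML
[root@master-node ~]# filebeat test output -c /etc/filebeat/filebeat.yml  # check the ES connection
[root@master-node ~]# curl 'http://192.168.5.166:9200/_cat/indices?v' | grep log-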
6. Run as a daemon
Reference: https://blog.csdn.net/wu2700222/article/details/85044117
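If Filebeat was installed from the RPM, it already ships with a systemd unit, so the simplest daemon setup (an alternative to the script in the referenced article) is:
[root@master-node ~]# systemctl enable filebeat   # start on boot
[root@master-node ~]# systemctl start filebeat
[root@master-node ~]# systemctl status filebeat   # confirm it is running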