Log monitoring: the application writes its logs to a designated directory, Filebeat ships those local log files to the target Kafka cluster, and log analysis is then performed downstream.
1. Install Filebeat on the server (e.g. filebeat-7.0.1-x86_64.rpm);
2. Edit /etc/filebeat/filebeat.yml. An example filebeat.yml:
#=========================== Filebeat inputs =============================
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/app/logs/data.log    # change to your application's log file path
  multiline.pattern: '^20'       # a new event starts with "20" (a 20xx timestamp)
  multiline.negate: true
  multiline.match: after
  tail_files: true
#============================= Filebeat modules ===============================
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
#==================== Elasticsearch template setting ==========================
setup.template.settings:
  index.number_of_shards: 3
#================================ Outputs =====================================
#-------------------------------- Kafka output --------------------------------
output.kafka:
  enabled: true
  hosts: ["10.3.7.34:9092", "10.3.7.35:9092", "10.3.7.36:9092"]  # change to your Kafka broker addresses
  topic: "data"  # change to the corresponding Kafka topic
#================================ Processors ==================================
processors:
- drop_fields:
    fields: ["@timestamp", "beat", "input", "offset", "source", "@metadata", "host", "prospector"]
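The multiline settings are the subtle part of this config: with `multiline.negate: true` and `multiline.match: after`, any line that does not start with `20` (i.e. is not a new `20xx-...` timestamped entry) is appended to the previous event, so a multi-line stack trace arrives in Kafka as a single message instead of many. A minimal Python sketch of that grouping logic (the sample log lines are invented for illustration):

```python
import re

# Same regex as multiline.pattern in filebeat.yml: a new log
# event begins with "20" (the start of a 20xx timestamp).
pattern = re.compile(r'^20')

def group_multiline(lines):
    """Mimic Filebeat's multiline.negate=true / multiline.match=after:
    lines NOT matching the pattern are attached to the previous event."""
    events = []
    for line in lines:
        if pattern.match(line) or not events:
            events.append(line)           # matching line starts a new event
        else:
            events[-1] += "\n" + line     # non-matching line is a continuation
    return events

# Hypothetical application log containing a multi-line stack trace.
sample = [
    "2024-05-01 10:00:00 INFO starting up",
    "2024-05-01 10:00:01 ERROR something failed",
    "java.lang.NullPointerException",
    "    at com.example.App.main(App.java:42)",
    "2024-05-01 10:00:02 INFO recovered",
]

events = group_multiline(sample)
print(len(events))  # → 3: the stack trace stays attached to its ERROR line
```

Without these three settings, each stack-trace line would be shipped as a separate Kafka message, which makes downstream log analysis much harder.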