ELK+FileBeat

Introduction

   "ELK" is an acronym for three open-source projects: Elasticsearch, Logstash, and Kibana. Elasticsearch is a search and analytics engine. Logstash is a server-side data-processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then ships it to a "stash" such as Elasticsearch. Kibana lets users visualize the data in Elasticsearch with charts and graphs.

 

Installation

  • ELK server: codeus-log  172.17.202.147
  • nginx proxy: codeus-zabbix 172.17.202.149

 

Packages to install on codeus-log:

  • rsyslog
  • logstash 7.3.0
  • elasticsearch 7.3.0
  • kibana 7.3.0
  • jdk 1.8

 

Packages to install on the Java servers (the hosts whose logs are collected):

  • filebeat

 

rsyslog collects the logs of every host, including the nginx logs.

  • rsyslog server configuration (codeus-log)
  • egrep -v '^#|^$' /etc/rsyslog.conf
$ModLoad imuxsock # provides support for local system logging (e.g. via logger command)
$ModLoad imjournal # provides access to the systemd journal
$ModLoad imklog # reads kernel messages (the same are read from journald)
$ModLoad immark  # provides --MARK-- message capability
$ModLoad imudp
$UDPServerRun 514
$ModLoad imtcp
$InputTCPServerRun 55140
$template RemoteLogs,"/data/log/%FROMHOST-IP%/%PROGRAMNAME%.log"
*.*  ?RemoteLogs
& ~
$WorkDirectory /var/lib/rsyslog
$ActionFileDefaultTemplate RSYSLOG_TraditionalFileFormat
$IncludeConfig /etc/rsyslog.d/*.conf
$OmitLocalLogging on
$IMJournalStateFile imjournal.state
*.info;mail.none;authpriv.none;cron.none                /var/log/messages
authpriv.*                                              /var/log/secure
mail.*                                                  -/var/log/maillog
cron.*                                                  /var/log/cron
*.emerg                                                 :omusrmsg:*
uucp,news.crit                                          /var/log/spooler
local7.*                                                /var/log/boot.log
  • rsyslog client configuration
  • egrep -v '^#|^$' /etc/rsyslog.conf
$ModLoad imuxsock # provides support for local system logging (e.g. via logger command)
$ModLoad imklog   # provides kernel logging support (previously done by rklogd)
$ActionFileDefaultTemplate RSYSLOG_TraditionalFileFormat
$IncludeConfig /etc/rsyslog.d/*.conf
*.info;mail.none;authpriv.none;cron.none                /var/log/messages
authpriv.*                                              /var/log/secure
mail.*                                                  -/var/log/maillog
cron.*                                                  /var/log/cron
*.emerg                                                 *
uucp,news.crit                                          /var/log/spooler
local7.*                                                /var/log/boot.log
module(load="imfile" PollingInterval="5")
$InputFileName /tmp/logs/app.log
$InputFileTag nova-info:
$InputFileStateFile state-nova-info
$InputRunFileMonitor
*.* @@172.17.202.147:55140
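The `RemoteLogs` template on the server decides where each forwarded message lands on disk. A pure-shell illustration of that mapping (the variables stand in for rsyslog's `%FROMHOST-IP%` and `%PROGRAMNAME%` properties; the client address is hypothetical):

```shell
# Stand-ins for rsyslog's message properties:
FROMHOST_IP=172.17.202.146   # hypothetical client address
PROGRAMNAME=nova-info        # the tag set by $InputFileTag above
# The template "/data/log/%FROMHOST-IP%/%PROGRAMNAME%.log" resolves to:
echo "/data/log/${FROMHOST_IP}/${PROGRAMNAME}.log"
# → /data/log/172.17.202.146/nova-info.log
```

This per-IP layout is why the Logstash nginx pipeline later reads from paths such as /data/log/172.17.202.146/nginx-access.log.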

Install the JDK from the binary tarball

  • tar -zxvf jdk-8u211-linux-x64.tar.gz -C /usr/local/
  • mv /usr/local/jdk1.8.0_211/ /usr/local/jdk
  • vim /etc/profile
JAVA_HOME=/usr/local/jdk
 
PATH=$JAVA_HOME/bin:$PATH
 
export JAVA_HOME PATH
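Appending to /etc/profile only affects new login shells; apply it to the current session with `source /etc/profile`. The ordering matters: prepending `$JAVA_HOME/bin` makes this JDK shadow any distro-packaged java. A minimal check of the resulting PATH:

```shell
JAVA_HOME=/usr/local/jdk
PATH=$JAVA_HOME/bin:$PATH
export JAVA_HOME PATH
# The new JDK directory is now the first PATH entry:
echo "${PATH%%:*}"
# → /usr/local/jdk/bin
```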

Install Logstash via RPM

  • rpm -ivh logstash-7.3.0.rpm
  • Install the multiline plugin: /data/logstash/bin/logstash-plugin install logstash-filter-multiline
  • vim /usr/share/logstash/app.conf
input {
    beats {
        port => "5044"
    }
}
 
filter {
    mutate {
        rename => { "[host][name]" => "host" }
    }
    multiline {
         pattern => "^201.*-.*-.*"
         negate => true
         what => "previous"
    }
}
 
filter {
    mutate {
        rename => { "[host][name]" => "host" }
    }
    if [type] == "codeus-app02-accesslog" {
        grok {
            match => ["message","%{JAVA_DATE:date} %{JAVA_TIME:time} %{JAVA_PORT:port} %{JAVA_LOGLEVEL:loglevel} %{JAVA_LOGTYPE:logtype} %{JAVA_NUMBER:number} %{JAVA_NULL:null} %{JAVA_DATE_TIME:date_time} %{JAVA_USERID:userID} %{JAVA_USERIP:userIP} %{JAVA_BREXPO:brexpo} %{JAVA_CONNECT:connect}"]
        }
    }
}
 
filter {
    mutate {
        rename => { "[host][name]" => "host" }
    }
    if [type] == "codeus-app02-website-accesslog" {
        grok {
            match => ["message","%{JAVA_DATE:date} %{JAVA_TIME:time} %{JAVA_PORT:port} %{JAVA_LOGLEVEL:loglevel} %{JAVA_LOGTYPE:logtype} %{JAVA_NUMBER:number} %{JAVA_NULL:null} %{JAVA_DATE_TIME:date_time} %{JAVA_USERID:userID} %{JAVA_USERIP:userIP} %{JAVA_BREXPO:brexpo} %{JAVA_CONNECT:connect}"]
        }
    }
}
 
filter {
    mutate {
        rename => { "[host][name]" => "host" }
    }
    if [type] == "codeus-app01-accesslog" {
        grok {
            match => ["message","%{JAVA_DATE:date} %{JAVA_TIME:time} %{JAVA_PORT:port} %{JAVA_LOGLEVEL:loglevel} %{JAVA_LOGTYPE:logtype} %{JAVA_NUMBER:number} %{JAVA_NULL:null} %{JAVA_DATE_TIME:date_time} %{JAVA_USERID:userID} %{JAVA_USERIP:userIP} %{JAVA_BREXPO:brexpo} %{JAVA_CONNECT:connect}"]
        }
    }
}
 
filter {
    mutate {
        rename => { "[host][name]" => "host" }
    }
    if [type] == "codeus-app01-website-accesslog" {
        grok {
            match => ["message","%{JAVA_DATE:date} %{JAVA_TIME:time} %{JAVA_PORT:port} %{JAVA_LOGLEVEL:loglevel} %{JAVA_LOGTYPE:logtype} %{JAVA_NUMBER:number} %{JAVA_NULL:null} %{JAVA_DATE_TIME:date_time} %{JAVA_USERID:userID} %{JAVA_USERIP:userIP} %{JAVA_BREXPO:brexpo} %{JAVA_CONNECT:connect}"]
        }
    }
}
 
output {
#    stdout { codec => rubydebug }
    elasticsearch {
        hosts => "127.0.0.1"
        index => "logstash-%{+YYYY.MM.dd}"
   }
}
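Note that the `JAVA_*` patterns used in the grok matches above are not grok built-ins; they must be defined in a patterns file, and the file's directory referenced via `patterns_dir` inside each `grok` block. A hypothetical sketch (the path and regexes are placeholders, not the definitions actually used here):

```
# /usr/share/logstash/patterns/java  (hypothetical path and contents)
JAVA_DATE \d{4}-\d{2}-\d{2}
JAVA_TIME \d{2}:\d{2}:\d{2}\.\d{3}
JAVA_LOGLEVEL (TRACE|DEBUG|INFO|WARN|ERROR|FATAL)
```

Each `grok` block would then carry `patterns_dir => ["/usr/share/logstash/patterns"]` alongside its `match`.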
  • vim /usr/share/logstash/nginx-log.conf
input {
     file {
        path => ["/data/log/172.17.222.246/nginx-access.log"]
        type => "app02-nginx-access_log"
        start_position => "beginning"
     }
}
 
input {
     file {
        path => ["/data/log/172.17.202.146/nginx-access.log"]
        type => "app01-nginx-access_log"
        start_position => "beginning"
     }
}
 
input {
     file {
        path => ["/data/log/172.17.222.246/nginx-error.log"]
        type => "app02-nginx-error_log"
        start_position => "beginning"
     }
}
 
input {
     file {
        path => ["/data/log/172.17.202.146/nginx-error.log"]
        type => "app01-nginx-error_log"
        start_position => "beginning"
     }
}
 
filter {
    if [type] == "app02-nginx-access_log" {
        grok {
             match => ["message","%{DATA:rsyslog_date_time} %{DATA:nginx_system} %{DATA:nginx_type} %{IP:remote_add} %{DATA:remote_user} %{DATA:null} %{DATA:time_local} %{QS:request} %{DATA:status} %{DATA:body_bytes_sent} %{DATA:http_referer} %{QS:http_user_agent} %{DATA:http_x_forwarded_for} %{NUMBER:request_time}"]
        }
    }
}
 
filter {
    if [type] == "app01-nginx-access_log" {
        grok {
             match => ["message","%{DATA:rsyslog_date_time} %{DATA:nginx_system} %{DATA:nginx_type} %{IP:remote_add} %{DATA:remote_user} %{DATA:null} %{DATA:time_local} %{QS:request} %{DATA:status} %{DATA:body_bytes_sent} %{DATA:http_referer} %{QS:http_user_agent} %{DATA:http_x_forwarded_for} %{NUMBER:request_time}"]
        }
    }
}
 
output {
    elasticsearch {
        hosts => "127.0.0.1"
        index => "logstash-%{+YYYY.MM.dd}"
   }
}
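The access-log grok patterns above assume each line starts with the rsyslog timestamp, hostname, and program tag before the usual nginx fields. A quick sanity check of the field order on a sample line (the line itself is invented for illustration):

```shell
# Fabricated sample in the layout the grok pattern expects:
# rsyslog date/time, host, tag, then the nginx access-log fields.
line='Aug 12 10:00:01 codeus-app02 nginx-access: 10.0.0.5 - - [12/Aug/2019:10:00:01 +0800] "GET / HTTP/1.1" 200 612 - "Mozilla/5.0" - 0.003'
# %{IP:remote_add} should be the 6th whitespace-separated token:
echo "$line" | awk '{print $6}'
# → 10.0.0.5
```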

Install Elasticsearch with Docker

docker run -p 9200:9200 -e "http.host=0.0.0.0" -e "transport.host=127.0.0.1" --name elastic -d elasticsearch:7.3.0

Point Docker's storage directory at /data/docker_dir

  • vim /usr/lib/systemd/system/docker.service

ExecStart=/usr/bin/dockerd -H fd:// --containerd=/run/containerd/containerd.sock --graph /data/docker_dir
  • systemctl disable docker
  • systemctl daemon-reload
  • systemctl start docker
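Editing the unit file works, but `--graph` has long been a deprecated alias of `--data-root`; the same relocation can be done without touching the unit file by putting the setting in /etc/docker/daemon.json and restarting Docker:

```json
{
  "data-root": "/data/docker_dir"
}
```

Either way, `docker info | grep 'Docker Root Dir'` confirms the active storage directory.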

Install Kibana from the binary tarball, running as the unprivileged user work

  • cd /data
  • tar -zxvf kibana-7.3.0-linux-x86_64.tar.gz
  • mv kibana-7.3.0-linux-x86_64 kibana
  • chown -R work:work kibana
  • egrep -v '^#|^$' /data/kibana/config/kibana.yml  (configuration file after modification)
server.host: "172.17.202.147"
server.basePath: "/kibana"
elasticsearch.hosts: ["http://localhost:9200"]
xpack.reporting.encryptionKey: "a_random_string"
xpack.security.encryptionKey: "something_at_least_32_characters"

Filebeat on the Java servers

wget https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-5.6.5-linux-x86_64.tar.gz

tar -zxvf filebeat-5.6.5-linux-x86_64.tar.gz

mv filebeat-5.6.5-linux-x86_64 /usr/local/filebeat

cd /usr/local/filebeat/

mv filebeat.yml filebeat.yml.bak

vim filebeat.yml

filebeat:
  prospectors:
  -
    document_type: codeus-app02-applog
    paths:
      - /tmp/logs/app.log
    input_type: log
 
  -
    document_type: codeus-app02-accesslog
    paths:
      - /tmp/logs/access.log
    input_type: log
 
  -
    document_type: codeus-app02-website-applog
    paths:
      - /tmp/logs/website_app.log
    input_type: log
 
  -
    document_type: codeus-app02-website-accesslog
    paths:
      - /tmp/logs/website_access.log
    input_type: log
 
output.logstash:
  hosts: ["172.17.202.147:5044"]
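The `prospectors` / `document_type` keys are Filebeat 5.x syntax; Filebeat 6+ renames the section to `filebeat.inputs` and removes `document_type`. A rough 6+/7.x equivalent for one of the inputs (the `log_type` field name is an assumption, and the Logstash `[type]` conditionals would have to match on it instead):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /tmp/logs/access.log
    fields:
      log_type: codeus-app02-accesslog
    fields_under_root: true

output.logstash:
  hosts: ["172.17.202.147:5044"]
```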

Startup

  • logstash
cd /usr/share/logstash
 
nohup bin/logstash -f nginx-log.conf &
 
nohup bin/logstash -f app.conf --path.data=/usr/share/logstash/app &    # with multiple instances, each additional one must be given its own working directory via --path.data
  • elasticsearch
docker run -p 9200:9200 -e "http.host=0.0.0.0" -e "transport.host=127.0.0.1" --name elastic -d elasticsearch:7.3.0
  • kibana
su - work
 
cd /data/kibana
 
./bin/kibana &
  • rsyslog
service rsyslog restart
 
systemctl restart rsyslog
  • filebeat
cd /usr/local/filebeat
 
./filebeat &
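Both Logstash pipelines write to a daily index named `logstash-%{+YYYY.MM.dd}`. Once everything is up, today's index name can be reproduced in shell, which is handy for ad-hoc queries or cleanup:

```shell
# Today's index name, matching Logstash's %{+YYYY.MM.dd} date format:
idx="logstash-$(date +%Y.%m.%d)"
echo "$idx"
# e.g. check the document count (assumes Elasticsearch on localhost:9200):
# curl -s "localhost:9200/${idx}/_count"
```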

nginx configuration

cat /etc/nginx/conf.d/kibana.conf

server {
    listen 35601;
 
    location / {
        auth_basic " Basic Authentication ";
        auth_basic_user_file "/etc/nginx/.htpasswd";
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
 
        proxy_pass  http://172.17.202.147:5601/;
        rewrite ^/kibana/(.*)$ /$1 break;
    }
}
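The server block references /etc/nginx/.htpasswd, but its creation isn't shown. One way to generate an entry with `openssl` (the user name and password here are placeholders; in practice append the line to /etc/nginx/.htpasswd):

```shell
# Generate an htpasswd-style entry (apr1/MD5 scheme, understood by nginx):
user=kibana            # placeholder user name
pass='change-me'       # placeholder password
entry="${user}:$(openssl passwd -apr1 "$pass")"
echo "$entry"          # e.g. kibana:$apr1$<salt>$<hash>
```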

Filebeat installation and configuration


#filebeat.inputs:
filebeat:
  prospectors:
  -
    document_type: cloudcompiler-app03-applog
    paths:
      - /tmp/logs/cloudcompiler-log/app.log
    input_type: log
 
output.logstash:
  hosts: ["172.17.202.147:5044"]
  • ./filebeat &

Filebeat installed from the binary tarball kept exiting unexpectedly; after switching to the RPM install this no longer happened, so the RPM method is recommended.
