Building Website Traffic Monitoring and Visualization with ELK

Prerequisites: before building the ELK traffic-monitoring dashboard, install ElasticSearch, Kibana, and Logstash.

ELK Website Traffic Visualization Monitoring

(figure: the three components of the ELK stack)

As the figure above shows, ELK is a combination of three Elastic products: ElasticSearch, Logstash, and Kibana. Their deployment relationship is shown in the figure below:

(figure: deployment relationship of the three components)

Logstash is like a miner: it collects the raw material and stores it in the ElasticSearch warehouse; Kibana then processes the material stored in ElasticSearch, packages it into a product, and presents it in a web interface. The basic workflow is shown in the figure below:
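The three stages just described map directly onto a Logstash pipeline definition. A minimal skeleton (the log path and ES address here are placeholders, not from this deployment; COMBINEDAPACHELOG is a built-in grok pattern) looks like:

```conf
input {
    file { path => "/path/to/access.log" }                        # the "miner" collects raw lines
}
filter {
    grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }     # structure each line into fields
}
output {
    elasticsearch { hosts => ["http://localhost:9200"] }          # store in the "warehouse"
}
```

The concrete pipelines later in this article follow exactly this input / filter / output shape.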

(figure: basic ELK workflow)

Tools

Grok Debugger is a commonly used tool for structuring log lines; patterns can be debugged online at:
http://grokdebug.herokuapp.com
(accessing it from mainland China may require a proxy; a domestic mirror of the Grok debugger is also available)

Website Traffic Monitoring Configuration

Deploying the Tomcat application

Deploy the Tomcat application and write its access logs to the /guaoran/elk/webapps/logs/bigdata/ directory.

Modify the Tomcat access-log output path in conf/server.xml:

<Host>
    <Valve className="org.apache.catalina.valves.AccessLogValve" directory="/guaoran/elk/webapps/logs/bigdata"
           prefix="localhost_access_log" suffix=".log"
           pattern="%h %l %u %t &quot;%r&quot; %s %b" />
    <Context path="/" docBase="/guaoran/elk/webapps/bigdata" reloadable="false"></Context>
</Host>

Configuring Logstash to read, parse, and ship logs to ElasticSearch

Tomcat access logs

Create the pattern directory and add the pattern file:

cd /guaoran/elk/logstash-6.5.1/config
mkdir patterns
vi /guaoran/elk/logstash-6.5.1/config/patterns/tomcat

Sample log content from /guaoran/elk/webapps/logs/bigdata/localhost_access_log.2019-05-09.log:

10.30.185.250 - - [11/Apr/2019:17:59:47 +0800] "GET / HTTP/1.0" 200 11064
10.30.185.250 - - [11/Apr/2019:17:59:47 +0800] "GET /css/base.css HTTP/1.0" 200 996
10.30.185.250 - - [11/Apr/2019:17:59:47 +0800] "GET /css/common.css HTTP/1.0" 200 10564
10.30.185.250 - - [11/Apr/2019:17:59:47 +0800] "GET /css/index.css HTTP/1.0" 200 23328

Pattern file /guaoran/elk/logstash-6.5.1/config/patterns/tomcat:

TOMCATACCESS %{IPORHOST:clientip} (?:-|%{USER:ident}) (?:-|%{USER:auth}) \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|-)\" %{NUMBER:response} (?:-|%{NUMBER:bytes})
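To sanity-check the pattern, here is an equivalent plain-Python regex run against one of the sample lines. This is a simplified sketch of what the grok macros (IPORHOST, HTTPDATE, etc.) expand to, not the grok engine itself:

```python
import re

# Simplified Python equivalent of the TOMCATACCESS grok pattern above.
pattern = re.compile(
    r'(?P<clientip>\S+) (?P<ident>\S+) (?P<auth>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+)(?: HTTP/(?P<httpversion>[\d.]+))?" '
    r'(?P<response>\d+) (?P<bytes>\d+|-)'
)

line = '10.30.185.250 - - [11/Apr/2019:17:59:47 +0800] "GET /css/base.css HTTP/1.0" 200 996'
fields = pattern.match(line).groupdict()
# fields['clientip'] -> '10.30.185.250', fields['request'] -> '/css/base.css'
```

The named groups correspond one-to-one with the field names grok attaches to the event.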

Pipeline configuration /guaoran/elk/logstash-6.5.1/config/tomcat.conf:

input {
    file {
        path => ["/guaoran/elk/webapps/logs/bigdata/localhost_access_log.*.log"]
		type => "bigdata_tomcat"
		#sincedb_path => "/dev/null"
		start_position => "beginning"
    }
}

filter {
	if [type] == "bigdata_tomcat" {
		grok {
			patterns_dir => "/guaoran/elk/logstash-6.5.1/config/patterns/"
			match => {
				"message" => "%{TOMCATACCESS}"
			}
		}
		date {
        	match => ["timestamp","dd/MMM/YYYY:HH:mm:ss Z"]
        }
    }
}

output {
	stdout {
		codec => rubydebug
	}
	elasticsearch {
		hosts => ["http://192.168.192.129:9200"]
		index => "logstash-%{type}-%{+YYYY.MM.dd}"
		document_type => "%{type}"
		sniffing => true
	}
}
Nginx access logs

Format 1

Log format:

log_format main '$host $status [$time_local] $remote_addr [$time_local] $request_uri '
                    '"$http_referer" "$http_user_agent" "$http_x_forwarded_for" '
                    '$bytes_sent $request_time $sent_http_x_cache_hit';

Sample log content from /guaoran/elk/webapps/logs/agdata/:

www.guaoran.cn 200 [08/May/2019:08:00:01 +0800] 127.0.0.1 [08/May/2019:08:00:01 +0800] /datarecommend/getrecommand-465212.do "-" "Apache-HttpClient/4.5.3 (Java/1.8.0_144)" "-" 811 0.011 -
www.guaoran.cn 206 [09/May/2019:03:34:18 +0800] 127.0.0.1 [09/May/2019:03:34:18 +0800] /js/jquery-1.9.1.js "http://www.guaoran.cn/client/helpInfo.html" "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.101 Safari/537.36" "-" 23869 0.003 -

Pipeline configuration /guaoran/elk/logstash-6.5.1/config/agdata.conf:

input {
    file {
        path => ["/guaoran/elk/webapps/logs/agdata/*"]
		type => "agdata_nginx"
		#sincedb_path => "/dev/null"
		start_position => "beginning"
    }
}

filter {
	grok {
		match => {
			"message" => "(?:-|%{URIHOST:HOSTNAME}) %{NOTSPACE:response} \[%{HTTPDATE:timestamp2}\] %{IPORHOST:clientip} \[%{HTTPDATE:timestamp}\] %{NOTSPACE:request} (?:-|%{QS:referrer}) %{QS:agent} %{QS:xforwardedfor} (?:-|%{NUMBER:bytes}) %{BASE10NUM:response_time} "
		}
	}
	date {
		match => ["timestamp","dd/MMM/YYYY:HH:mm:ss Z"]
	}
	mutate{
		gsub => ["HOSTNAME",'"','']
	}
	mutate{
		convert => {"bytes"=>"float"}
	}
	geoip{
		source => "clientip"
	}
	useragent{
		source => "agent"
		target => "useragent"
	}

	if [referrer] =~ /^"http/ {
		grok{
			match => {
				"referrer" => '%{URIPROTO}://%{URIHOST:referrer_host}'
			}
		}
	}
	mutate{
		gsub => ["referrer",'"','']
	}
	mutate{remove_field=>["message"]}
}

output {
	#stdout {
	#	codec => rubydebug {
	#		metadata => true
	#	}
	#}
	elasticsearch {
		hosts => ["http://192.168.192.129:9200"]
		index => "logstash-%{type}-%{+YYYY.MM.dd}"
		document_type => "%{type}"
		sniffing => true
	}
}
Format 2

Sample log content from /guaoran/elk/webapps/logs/demo/:

127.0.0.1 - - [11/Nov/2018:00:01:02 +0800] "POST /api3/getrelevantcourse HTTP/1.1" 200 774 "www.guaoran.cn" "-" cid=608&secrect=xx&timestamp=1478707262003&token=xx&uid=4203162 "guaoran/5.0.2 (iPhone; iOS 10.0.1; Scale/3.00)" "-" 127.0.0.1:80 200 0.048 0.048
127.0.0.1 - - [11/Nov/2018:00:01:18 +0800] "POST /course/ajaxmediauser HTTP/1.1" 200 54 "www.guaoran.cn" "http://www.guaoran.cn/code/1883" mid=1883&time=60 "Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; rv:11.0) like Gecko" "-" 127.0.0.1:80 200 0.016 0.016
127.0.0.1 - - [11/Nov/2018:00:01:18 +0800] "HEAD / HTTP/1.1" 301 0 "127.0.0.1" "-" - "curl/7.19.7 (x86_64-redhat-linux-gnu) libcurl/7.19.7 NSS/3.16.2.3 Basic ECC zlib/1.2.3 libidn/1.18 libssh2/1.4.2" "-" - - - 0.000

Pipeline configuration /guaoran/elk/logstash-6.5.1/config/demo.conf:

input{
    file{
        path => "/guaoran/elk/webapps/logs/demo/access.*.log"
        #sincedb_path => "/dev/null"
		start_position => "beginning"
    }
}

filter{
    if [@metadata][debug] {
        mutate{ remove_field => ["headers"] }
    }

    grok{
        match => {
           "message" => '%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:[@metadata][timestamp]}\] "%{WORD:verb} %{DATA:request} HTTP/%{NUMBER:httpversion}" %{NOTSPACE:response_status_code} (?:%{NUMBER:bytes}|-) %{QS:hostname} %{QS:referrer} (?:-|%{DATA:params}) %{QS:agent} %{QS:xforwardedfor} (?:-|%{MY_URI:upstream_host}) (?:-|%{MY_RESP:upstream_response_status_code}) (?:-|%{MY_RESP_TIME:upstream_response_time}) %{BASE10NUM:response_time:float}'
        }
        pattern_definitions=>{
            "MY_URI" => '%{URIHOST}(, %{URIHOST})*'
            "MY_RESP" => '%{NUMBER}(, %{NUMBER})*'
            "MY_RESP_TIME" => '%{BASE10NUM}(, %{BASE10NUM})*'
        }
    }
    date{
        match => ["[@metadata][timestamp]","dd/MMM/yyyy:HH:mm:ss Z"]
    }
    mutate{
        split => {"upstream_host" => ", "}
        split => {"upstream_response_status_code" => ", "}
        split => {"upstream_response_time" => ", "}
        gsub => ["hostname",'"','']
    }
    mutate{
        convert => {"upstream_response_time"=>"float"}
    }
    geoip{
        source => "clientip"
    }
    useragent{
        source => "agent"
        target => "useragent"
    }
    mutate{
        add_field => {
            "[@metadata][index]" => "nginx_logs_%{+YYYY.MM.dd}"
        }
    }

    if [referrer] =~ /^"http/ {
        grok{
            match => {
                "referrer" => '%{URIPROTO}://%{URIHOST:referrer_host}'
            }
        }
        if "guaoran.cn" in [referrer_host] {
            grok{
                match => {
                    "referrer" => ['%{URIPROTO}://%{URIHOST}/(%{NOTSPACE:demo_type}/%{NOTSPACE:demo_res_id})?"','%{URIPROTO}://%{URIHOST}/(%{NOTSPACE:demo_type})?"']
                }
            }
        }
    }
    mutate{
        gsub => ["referrer",'"','']
    }
    if "_grokparsefailure" in [tags] {
        mutate{
            replace => {
                "[@metadata][index]" => "nginx_logs_parsefailure_%{+YYYY.MM.dd}"
            }
        }
    }else{
        mutate{remove_field=>["message"]}
    }
}

output{
	if [@metadata][debug]{
		stdout{
			codec => rubydebug{
				metadata => true
			}
		}
	}else{
		#stdout{
		#	codec=>rubydebug
		#}
		elasticsearch{
			index => "%{[@metadata][index]}"
		}
	}
}
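The split and convert mutates in the filter above turn a comma-separated multi-upstream value into a list of floats. In plain Python the same transformation, applied to a hypothetical sample value, is:

```python
# Hypothetical sample: a request that passed through two upstreams.
upstream_response_time = "0.048, 0.016"

# mutate { split => {"upstream_response_time" => ", "} }
parts = upstream_response_time.split(", ")

# mutate { convert => {"upstream_response_time" => "float"} }
values = [float(v) for v in parts]
# values -> [0.048, 0.016]
```

The same applies to upstream_host and upstream_response_status_code, which is why the MY_URI / MY_RESP / MY_RESP_TIME pattern definitions allow repeated, comma-separated tokens.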
Starting Logstash

cd /guaoran/elk/logstash-6.5.1/
bin/logstash -f config/demo.conf -r

Integrating Logstash with ElasticSearch

When the Tomcat access logs need to be saved to ElasticSearch, append the following to the output section of logstash.conf:

elasticsearch {
	hosts => ["http://192.168.192.129:9200"]
	index => "logstash-%{type}-%{+YYYY.MM.dd}"
    document_type => "%{type}"
    sniffing => true
}

Visualizing with Kibana

Official user guide

When the Tomcat application is accessed, the Logstash console prints the parsed event, for example:

{
      "timestamp" => "08/May/2019:10:28:36 +0800",
          "bytes" => "10809",
       "response" => "200",
    "httpversion" => "1.1",
     "@timestamp" => 2019-05-08T02:28:36.000Z,
           "path" => "/guaoran/elk/webapps/logs/localhost_access_log.2019-05-08.log",
           "type" => "tomcat_access",
           "host" => "localhost.localdomain",
           "verb" => "POST",
       "clientip" => "192.168.192.1",
        "message" => "192.168.192.1 - - [08/May/2019:10:28:36 +0800] \"POST /index.html HTTP/1.1\" 200 10809",
       "@version" => "1",
        "request" => "/index.html"
}
Matching an index

Before using Kibana properly, we need to match the index in our Elasticsearch. Elasticsearch may contain many indices, and for performance reasons Kibana does not import them all up front; we import only the index we actually need.

Finding the index created when the logs were inserted into ES

In the Kibana menu, click Monitoring, then Elasticsearch > Overview, then Indices; alternatively, the steps below also show which indices exist.

Importing an index

In the Kibana menu, click Management, then Kibana > Index Patterns, then Create index pattern.

(figure: Create index pattern screen)

The box in the figure above lists all indices. Enter the desired index in Index pattern (wildcard matching is supported), click Next step, choose @timestamp as the time filter field, and click Create index pattern to finish.

The mapped fields can then be inspected. (figure: field list of the index pattern)

Exploring data

In the Kibana menu, click Discover.

Select the corresponding index, then filter the data or display only the fields of interest.


Data visualization

In the Kibana menu, click Visualize, click the + button, and choose the desired chart type.

Logstash and Kibana Timezone Issues

The Logstash timezone issue

Reproducing the issue

Take the nginx log format and configuration above as an example.

Log content:

192.168.20.2 200 [14/May/2019:00:00:02 +0800] 192.168.20.44 [14/May/2019:00:00:02 +0800] /dataManual/dataIndustryTree.html "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36" "-" 11329 0.008 -

With stdout enabled:

stdout{
	codec=>rubydebug
}

Logstash prints the following when it reads the log at startup:

{
	"request" => "/dataManual/dataIndustryTree.html",
	...
  	"@timestamp" => 2019-05-13T16:00:02.000Z,
  	...
    "timestamp" => "14/May/2019:00:00:02 +0800"
}

Two problems appear here:

  1. The parsed time is 14/May/2019:00:00:02 +0800, but @timestamp is eight hours earlier.
  2. Opening Kibana shows that the generated index is named xxxx-2019.05.13 rather than xxxx-2019.05.14; evidently the index name is derived from the @timestamp value.
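The behaviour is easy to reproduce outside Logstash: @timestamp is stored in UTC, and the daily index suffix is derived from that UTC date (the index prefix below is just an illustrative example):

```python
from datetime import datetime, timezone

# Parse the local (+0800) access-log timestamp.
local = datetime.strptime("14/May/2019:00:00:02 +0800", "%d/%b/%Y:%H:%M:%S %z")

# Logstash stores @timestamp in UTC: eight hours earlier on the clock.
utc = local.astimezone(timezone.utc)
# utc -> 2019-05-13 16:00:02 UTC

# The daily index name uses the UTC date, hence ...2019.05.13, not ...2019.05.14.
index_name = "logstash-agdata_nginx-" + utc.strftime("%Y.%m.%d")
```

So nothing is lost or corrupted; the stored instant is correct, but both the displayed @timestamp and the index name use UTC rather than local time.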
Fixing the issue

Add the following to the filter section of the configuration file:

date {
	match => ["timestamp","dd/MMM/YYYY:HH:mm:ss Z"]
	target => "@timestamp"
}
ruby {
	code => "event.set('timestamp', event.get('@timestamp').time.localtime + 8*60*60)"
}
ruby {
	code => "event.set('@timestamp',event.get('timestamp'))"
}
mutate {remove_field => ["timestamp"]}
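The two ruby blocks above simply shift the UTC @timestamp forward by eight hours so the stored value reads as Beijing time. In plain Python, the shift amounts to:

```python
from datetime import datetime, timedelta, timezone

# @timestamp as stored by the date filter (UTC).
ts = datetime(2019, 5, 13, 16, 0, 2, tzinfo=timezone.utc)

# event.get('@timestamp').time.localtime + 8*60*60 : add eight hours.
shifted = ts + timedelta(hours=8)
# shifted reads as 2019-05-14 00:00:02
```

Note this hard-codes a +8 offset; it works here because all the servers and readers are in the same timezone.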

Run the import again; the output becomes:

{
	"request" => "/dataManual/dataIndustryTree.html",
    "@timestamp" => 2019-05-14T00:00:02.000Z,
    "timestamp" => 2019-05-14T00:00:02.000Z
}

Both problems described above are now resolved.

The Kibana timezone issue

Reproducing the issue

When the newly generated index is displayed in Kibana's Discover, we see the following:

(figure: Discover showing times eight hours ahead)

The times in the figure above are now eight hours ahead. Inspecting the index contents via Postman shows:

(figure: raw index contents returned by Postman)

The stored content is correct, but Kibana displays it incorrectly; this time it is a Kibana timezone problem.

Fixing the issue

Change Kibana's timezone:

In the Kibana menu, click Management, then Advanced Settings.


Change dateFormat:tz from Browser to Etc/UTC.


After saving, open Discover again and the problem is fixed.

Securing Kibana with Nginx Basic Authentication

Create user admin with password 123456 and store it in the htpasswd file:

sudo htpasswd -c -b /usr/local/nginx/conf/extra/htpasswd admin 123456

nginx.conf

server {
        listen       5602;
        access_log /mnt/logs/nginx/kibana.access.log main;

        location / {
            proxy_redirect         off;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header Host $http_host;
            proxy_pass  http://kibanaServer;
            auth_basic "kibana login auth";
            auth_basic_user_file /usr/local/nginx/conf/extra/htpasswd;
        }
}
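With auth_basic in place, the browser (or any HTTP client) must send an Authorization header carrying base64("user:password"). For the credentials created above, the header value works out as:

```python
import base64

# HTTP Basic auth: the header value is "Basic " + base64("user:password").
credentials = base64.b64encode(b"admin:123456").decode()
auth_header = "Basic " + credentials
# auth_header -> "Basic YWRtaW46MTIzNDU2"
```

Since base64 is trivially reversible, this only protects the credentials if Kibana is served over HTTPS.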