Using Logstash as a Collector for IIS Logs


  • LS client configuration file
The input section uses the file and eventlog plugins to collect the IIS logs and the Windows Event Log respectively; the filter section uses grok to split each log line into fields; the output section ships the filtered events to Redis.
input {
  file {
    type => "IIS"
    path => "D:/iislog/xxx.xxx.com/W3SVC663037409/*.log"
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
}

input {
  eventlog {
    type => 'Win32-EventLog'
    logfile => ["Application", "System", "Security"]
  }
}

filter {
  #ignore log comments
  if [message] =~ "^#" {
    drop {}
  }

  grok {
    # check that fields match your IIS log settings
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} (%{WORD:s-sitename}|-) (%{IPORHOST:s-ip}|-) (%{WORD:cs-method}|-) %{NOTSPACE:cs-uri-stem} %{NOTSPACE:cs-uri-query} (%{NUMBER:s-port}|-) (%{IPORHOST:c-ip}|-) %{NOTSPACE:cs-useragent} %{NOTSPACE:cs-referer} %{NOTSPACE:cs-host} (%{NUMBER:sc-status}|-) (%{NUMBER:sc-bytes}|-) (%{NUMBER:cs-bytes}|-) (%{NUMBER:time-taken}|-)"]
  }
  # set the event timestamp from the log
  date {
    match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
    timezone => "Etc/UTC"
  }
  mutate {
    remove_field => [ "log_timestamp" ]
    convert => [ "sc-bytes", "float" ]
    convert => [ "cs-bytes", "float" ]
    convert => [ "time-taken", "float" ]
  }
}

output {
  #stdout { codec => rubydebug }
  redis {
    host => "192.168.xx.xxx"
    data_type => "list"
    key => "test:redis"
  }
}
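The Redis output above only hands events to a broker; a separate Logstash indexer then reads from the same list and writes to Elasticsearch. A minimal sketch of that indexer side, assuming the same Redis host and a default Elasticsearch setup (both addresses are placeholders, not from the original deployment):

```
input {
  redis {
    host => "192.168.xx.xxx"   # same Redis broker as above (assumed)
    data_type => "list"
    key => "test:redis"
  }
}
output {
  elasticsearch {
    host => "192.168.xx.xxx"   # Elasticsearch node address (assumed)
  }
}
```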
Notes:
1. The charset option works around Logstash's character-encoding problem: with the default "UTF-8", some bytes in the log still cannot be decoded, but after switching to "ISO-8859-1" the lines are read in correctly. The warning that appears otherwise:
Received an event that has a different character encoding than you configured. {:text=>"2014-12-22 14:22:52 /newDict/jp/improve_new.aspx sourcedict=1&jid=322316&wtype=jc&w=\\xCD?\\xFC 192.168.31.190 HTTP/1.1 Mozilla/4.0 - dict.hjenglish.com 200 5043 342\\r", :expected_charset=>"UTF-8", :level=>:warn}
Interrupt received. Shutting down the pipeline. {:level=>:warn}
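Why ISO-8859-1 avoids the warning can be seen with a quick check: the bytes \xCD and \xFC in the query string do not form a valid UTF-8 sequence, while ISO-8859-1 assigns a character to every possible byte value, so decoding can never fail. A small Python sketch (the byte string is abridged from the warning above):

```python
# Bytes from the logged query string; \xCD starts a UTF-8 two-byte
# sequence, but the following "?" (0x3F) is not a valid continuation byte.
raw = b"sourcedict=1&jid=322316&wtype=jc&w=\xCD?\xFC"

try:
    raw.decode("utf-8")
    print("UTF-8 decode succeeded")
except UnicodeDecodeError as e:
    print("UTF-8 decode failed:", e.reason)

# ISO-8859-1 maps every byte 0x00-0xFF to a code point, so decoding
# cannot fail -- the characters may look garbled, but the event gets through.
text = raw.decode("ISO-8859-1")
print(len(text) == len(raw))  # True: one character per byte
```

The trade-off is that non-Latin characters come out mojibake'd, but the pipeline no longer drops or stalls on them.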


2. To use the eventlog plugin, the "contrib plugins" package must be installed; otherwise the service fails to start with:
"LoadError: no such file to load -- jruby-win32ole"


3. Numeric fields must be force-converted with mutate's convert; otherwise they stay strings, and Elasticsearch cannot later apply aggregations (mean/total/max/etc.) to them. The resulting Elasticsearch error:

[2015-01-07 11:43:02,464][DEBUG][action.search.type       ] [Prester John] [logstash-2015.01.07][0], node[wL6TfFyxQI2fmsDWHI-bdA], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@61fc4bf9] lastShard [true]
org.elasticsearch.search.SearchParseException: [logstash-2015.01.07][0]: from[-1],size[-1]: Parse Failure [Failed to parse source [{"facets":{"0":{"date_histogram":{"key_field":"@timestamp","value_field":"time-taken","interval":"1s"},"global":true,"facet_filter":{"fquery":{"query":{"filtered":{"query":{"query_string":{"query":"cs-host:(dict.*)"}},"filter":{"bool":{"must":[{"range":{"@timestamp":{"from":1420601882243,"to":1420602182243}}},{"terms":{"s-ip.raw":["192.168.33.31"]}}]}}}}}}}},"size":0}]]
        at org.elasticsearch.search.SearchService.parseSource(SearchService.java:681)
        at org.elasticsearch.search.SearchService.createContext(SearchService.java:537)
        at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:509)
        at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:264)
        at org.elasticsearch.search.action.SearchServiceTransportAction$5.call(SearchServiceTransportAction.java:231)
        at org.elasticsearch.search.action.SearchServiceTransportAction$5.call(SearchServiceTransportAction.java:228)
        at org.elasticsearch.search.action.SearchServiceTransportAction$23.run(SearchServiceTransportAction.java:559)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassCastException: org.elasticsearch.index.fielddata.plain.PagedBytesIndexFieldData cannot be cast to org.elasticsearch.index.fielddata.IndexNumericFieldData
        at org.elasticsearch.search.facet.datehistogram.DateHistogramFacetParser.parse(DateHistogramFacetParser.java:199)
        at org.elasticsearch.search.facet.FacetParseElement.parse(FacetParseElement.java:93)
        at org.elasticsearch.search.SearchService.parseSource(SearchService.java:665)
        ... 9 more
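The root cause is visible in the Caused by line: time-taken was indexed as a string field (PagedBytesIndexFieldData), so it cannot be cast to numeric field data. What goes wrong when numeric values stay strings can be illustrated in a few lines of Python (hypothetical values):

```python
# time-taken values as they would arrive without mutate/convert:
as_strings = ["125", "93", "1040"]

# Strings compare lexicographically, so "max" is wrong and a mean
# cannot be computed at all:
print(max(as_strings))  # '93' -- because '9' sorts after '1'

# After the equivalent of mutate { convert => [ "time-taken", "float" ] }:
as_floats = [float(v) for v in as_strings]
print(max(as_floats))   # 1040.0
print(sum(as_floats) / len(as_floats))
```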



4. If instead of the real client IP you get the address of a front-end HAProxy, F5, or CDN, you need "X-Forwarded-For"; see the document provided by F5:
https://devcentral.f5.com/articles/x-forwarded-for-log-filter-for-windows-servers
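Once such an X-Forwarded-For log filter is installed, IIS appends the forwarded client IP as one more field on every log line, so the grok pattern needs an extra capture. A hedged sketch, assuming the new field is logged last and naming it x-forwarded-for (adjust the position and name to your actual #Fields header):

```
filter {
  grok {
    # Same IIS pattern as above, with one capture appended at the end for
    # the forwarded client IP; "..." stands for the unchanged middle of the
    # pattern, and the field name "x-forwarded-for" is an assumption.
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} ... (%{NUMBER:time-taken}|-) (%{IPORHOST:x-forwarded-for}|-)"]
  }
}
```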

5. How to debug grok expressions has already been covered by 三斗室:
http://chenlinux.com/2014/10/19/grokdebug-commandline/

6. The IIS log fields must match the grok pattern exactly; otherwise grok fails to match and the event is tagged "_grokparsefailure".
7. Which fields the match pattern captures depends on the IIS log format.
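For reference, the grok pattern in the filter above corresponds field-for-field to the following W3C #Fields header (derived from the capture names in the match pattern); if your IIS logging settings select different or reordered fields, the pattern must be edited to match:

```
#Fields: date time s-sitename s-ip cs-method cs-uri-stem cs-uri-query s-port c-ip cs(User-Agent) cs(Referer) cs-host sc-status sc-bytes cs-bytes time-taken
```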