Log analysis stack:

logstash (analysis) + elasticsearch (storage) + kibana (presentation)

Tools:

For data aggregation:

statsd

1. diamond --> statsd --> graphite

2. For real-time data collection:

logstash --> statsd --> graphite
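For reference, statsd accepts plain-text metrics over UDP (port 8125 by default). A quick way to push a test metric, assuming a statsd instance is listening locally; the metric name web.hits is made up here:

echo "web.hits:1|c" | nc -u -w1 127.0.0.1 8125    # |c marks a counter; |ms timings, |g gauges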

Log collection and visualization (LEK):

logstash + elasticsearch + kibana

Pipeline: logstash --> redis --> ElasticSearch

logstash has two working modes:

standalone: logstash on a single server

logstash ---> elasticsearch ---> kibana

centralized: logstash with multiple servers

 ______________________________________________________________________________________
|  client  |                                  server                                  |
|__________|__________________________________________________________________________|
| logstash |    | redis  |    | logstash |    |  elasticsearch   |    |    kibana     |
|__________|____|________|____|__________|____|__________________|____|_______________|
| shipper  |--->|        |    |          |    |                  |    |               |
|   ...    |--->| broker |--->| indexer  |--->| storage & search |--->| web interface |
| shipper  |--->|        |    |          |    |                  |    |               |
|__________|____|________|____|__________|____|__________________|____|_______________|
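A minimal sketch of the centralized mode, assuming redis is reachable at redis-ip:6379; the file names shipper.conf / indexer.conf and the key logstash are placeholders:

# on each client: ship logs into the redis broker
cat > shipper.conf <<'EOF'
input {
  file { path => "/var/log/httpd/access_log" type => "syslog" }
}
output {
  redis { host => "redis-ip" data_type => "list" key => "logstash" }
}
EOF

# on the server: pull events from redis and index them into elasticsearch
cat > indexer.conf <<'EOF'
input {
  redis { host => "redis-ip" data_type => "list" key => "logstash" }
}
output {
  elasticsearch { host => "elasticsearch ip" }
}
EOF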


LEK installation:

Test machine: X.X.X.X

A Java environment is required (check with java -version).

install JDK:

wget http://down1.chinaunix.net/distfiles/jdk-6u13-dlj-linux-i586.bin 

sh jdk-6u13-dlj-linux-i586.bin

Configure environment variables:

vim /etc/profile

export JAVA_HOME=/usr/java 

export PATH=$JAVA_HOME/bin:$PATH 

export CLASSPATH=.:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar:$CLASSPATH
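After editing /etc/profile, reload it in the current shell and verify the JDK is picked up:

source /etc/profile
echo $JAVA_HOME    # should print /usr/java
java -version      # should report the JDK just installed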

Define the installation path:

DIR=/root/lek/

1. Install elasticsearch

mkdir $DIR/elasticsearch

Method 1:

cd $DIR/elasticsearch

wget http://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-0.90.5.zip

unzip elasticsearch-0.90.5.zip

elasticsearch can be used as soon as it is unpacked. To start it:

cd elasticsearch-0.90.5

bin/elasticsearch -f

Access test:

curl -X GET http://192.168.70.31:9200
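Optionally, the cluster health API gives a quick sanity check (assuming elasticsearch is reachable at the same address):

curl -s 'http://192.168.70.31:9200/_cluster/health?pretty'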

Method 2 (personally I find this way a bit more convenient):

Install the rpm package: rpm -ivh elasticsearch-0.90.9.noarch.rpm

/etc/init.d/elasticsearch start
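On a RHEL/CentOS-style system the rpm also installs an init script, so the service can be registered to start at boot (a convenience step, not required):

chkconfig elasticsearch on
/etc/init.d/elasticsearch status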

2. Install elasticsearch-head, a cluster management tool for the distributed search engine elasticsearch:

/usr/share/elasticsearch/bin/plugin -install mobz/elasticsearch-head

http://192.168.70.31:9200/_plugin/head/

3. Install logstash

mkdir $DIR/logstash

cd $DIR/logstash

wget http://download.elasticsearch.org/logstash/logstash/logstash-1.2.1-flatjar.jar

logstash can be used as soon as it is downloaded; the command-line flags are documented at http://logstash.net/docs/1.2.1/flags
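A quick smoke test before writing a real config, using the -e flag to pass an inline config that reads stdin and prints structured events to stdout:

cd $DIR/logstash
java -jar logstash-1.2.1-flatjar.jar agent -e 'input { stdin {} } output { stdout { codec => rubydebug } }'
# type any line: it should come back as a structured event; Ctrl-C to exit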

4. Install kibana

mkdir $DIR/kibana

cd $DIR/kibana

wget http://download.elasticsearch.org/kibana/kibana/kibana-latest.zip

unzip kibana-latest.zip

cp -r  kibana-latest /var/www/html

Access URL:

http://X.X.X.X/kibana-latest/index.html

Edit the configuration file:

vim /var/www/html/kibana-latest/config.js

elasticsearch: "http://X.X.X.X:9200"  

Add a map of China:

Put the map.cn.js file under /var/www/html/kibana-latest/app/panels/map/lib/

Edit /var/www/html/kibana-latest/app/panels/map/editor.html:

Change ['world','europe','usa'] to ['world','europe','usa','cn'].
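For example, the change can be applied with sed (assuming the list appears in editor.html exactly as written above):

cd /var/www/html/kibana-latest/app/panels/map
cp editor.html editor.html.bak
sed -i "s/\['world','europe','usa'\]/['world','europe','usa','cn']/" editor.html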

5. Start LEK

Create the logstash configuration file:

cd $DIR/logstash

vim simple.conf

input {
  file {
    type => "syslog"
    path => "/var/log/httpd/access_log"
  }
}

output {
  elasticsearch {
    #embedded => true
    host => "elasticsearch ip"
  }
}

Start the logstash agent with this configuration file.

To use the elasticsearch embedded in logstash (enable embedded => true in the output section):

java -jar logstash-1.2.1-flatjar.jar agent -f simple.conf -- web &

To use a standalone elasticsearch:

java -jar logstash-1.2.1-flatjar.jar agent -f simple.conf &

After modifying kibana, restart httpd; the log data can then be viewed through the kibana web interface.
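To confirm that events are actually being indexed, query elasticsearch directly (X.X.X.X as above; the elasticsearch output creates daily logstash-YYYY.MM.DD indices by default):

curl -s 'http://X.X.X.X:9200/_aliases?pretty'            # lists the logstash-* indices
curl -s 'http://X.X.X.X:9200/logstash-*/_count?pretty'   # number of events indexed so far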

6. Advanced logstash usage

Splunk is the best product for log processing and analysis, but it is commercial; the free edition only handles 500 MB of logs per day.

Filtering PHP logs inside logstash:

input {
  file {
    path => "/var/log/php_errors.log"
    type => "phperror"
  }
}

filter {
  if [type] == "phperror" {
    grok {
      patterns_dir => "./p"    # the directory p holds the custom match patterns
      match => ["message", "%{PHP_LOG}"]
    }
    date {
      match => [ "timestamp", "dd-MMM-yyyy HH:mm:ss" ]
    }
    punct {}
  }
}

output {
  elasticsearch { embedded => true }
}
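To test the grok pattern interactively, here is a sketch that swaps the file input for stdin and the elasticsearch output for stdout (phperror-test.conf is a name made up here):

cat > phperror-test.conf <<'EOF'
input  { stdin { type => "phperror" } }
filter {
  grok {
    patterns_dir => "./p"
    match => ["message", "%{PHP_LOG}"]
  }
}
output { stdout { codec => rubydebug } }
EOF
java -jar logstash-1.2.1-flatjar.jar agent -f phperror-test.conf
# paste a line from the PHP error log; if %{PHP_LOG} does not match,
# the event is tagged with _grokparsefailure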

Contents of the directory p: p/phperror

The pattern content comes from GitHub.

PHP_TS  ...

PHP_LOG ...
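Grok pattern files contain one "NAME regex" definition per line. A hypothetical illustration of the format only (the real PHP_TS / PHP_LOG definitions come from GitHub as noted above):

mkdir -p p
cat > p/example <<'EOF'
EXAMPLE_TS %{MONTHDAY}-%{MONTH}-%{YEAR} %{TIME}
EOF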