Logstash in Practice
A simple example: read from stdin and print to stdout
logstash -e 'input{stdin{}}output{stdout{codec=>rubydebug}}'
Read from stdin and output to Elasticsearch
logstash -e 'input{stdin{}} output {elasticsearch{hosts=>["hadoop111:9200"]}}'
In real applications, the three Logstash stages (input, filter, output) are written into a single config file and run together
Collecting data from log files into a Kafka cluster with Logstash
input {
  file {
    codec => plain {
      charset => "UTF-8"
    }
    path => "/test/logstashData/*"
    # Check the path for new files every 5 seconds.
    discover_interval => 5
    # Read newly discovered files from the start, so files dropped
    # into the directory are ingested in full (with "end", only lines
    # appended after discovery would be picked up).
    start_position => "beginning"
  }
}
output {
  kafka {
    topic_id => "gamelogs-prj"
    codec => plain {
      # Ship only the raw log line, not the full event.
      format => "%{message}"
      charset => "UTF-8"
    }
    bootstrap_servers => "hadoop111:9092"
  }
}
- Run: ./bin/logstash -f ./xxx.conf
- Drop data files into the watched directory; Kafka gains a new gamelogs-prj topic, and its data can be consumed from the beginning as expected
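Because the kafka output's plain codec sets `format => "%{message}"`, only the raw log line reaches the topic, with no JSON envelope. A minimal Python sketch of that sprintf-style rendering (the event dict and log line below are invented for illustration):

```python
import re

def render(fmt: str, event: dict) -> str:
    """Replace %{field} references with values from the event,
    as Logstash's sprintf format does for top-level fields."""
    return re.sub(r"%\{(\w+)\}", lambda m: str(event.get(m.group(1), "")), fmt)

# Hypothetical event as the file input might produce it.
event = {
    "message": "LOGIN|2024-01-01 10:00:00|192.168.0.1|player1",
    "host": "hadoop111",
}

# Only the raw line is written to the Kafka topic.
print(render("%{message}", event))
# -> LOGIN|2024-01-01 10:00:00|192.168.0.1|player1
```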
Shipping data from the Kafka cluster into the Elasticsearch cluster with Logstash
input {
  kafka {
    type => "gamelogs-prj"
    # Start from the earliest offset when the group has no committed offset.
    auto_offset_reset => "earliest"
    codec => "plain"
    group_id => "elas-prj"
    topics => ["gamelogs-prj"]
    bootstrap_servers => "hadoop111:9092"
  }
}
filter {
  if [type] == "gamelogs-prj" {
    mutate {
      # Split the pipe-delimited line into an array,
      # then name each element as its own field.
      split => { "message" => "|" }
      add_field => {
        "event_type" => "%{[message][0]}"
        "current_time" => "%{[message][1]}"
        "user_ip" => "%{[message][2]}"
        "user" => "%{[message][3]}"
      }
      remove_field => [ "message" ]
    }
  }
}
output {
  if [type] == "gamelogs-prj" {
    elasticsearch {
      index => "gamelogs-prj"
      codec => plain {
        charset => "UTF-8"
      }
      hosts => ["hadoop111:9200"]
    }
  }
}
- Run: ./bin/logstash -f ./xxx.conf
- The Elasticsearch cluster gains a new gamelogs-prj index, with the data already written into it
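The filter's split/add_field step above amounts to mapping one pipe-delimited line to four named fields. A small Python sketch of that mapping (the sample log line is invented):

```python
def parse_gamelog(message: str) -> dict:
    """Mimic the mutate filter: split on '|' and name the four pieces."""
    event_type, current_time, user_ip, user = message.split("|")
    return {
        "event_type": event_type,
        "current_time": current_time,
        "user_ip": user_ip,
        "user": user,
    }

# Hypothetical game-log line.
line = "LOGIN|2024-01-01 10:00:00|192.168.0.1|player1"
print(parse_gamelog(line))
```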