Logstash + Kafka
1、Build and install Logstash, then change into the installation directory
2、cd to /logstash/config/
3、vim error.conf and add the following configuration:
input {
  file {
    path => ["/wutong/error.log"]    ### log file path
    type => "Active"
    start_position => "beginning"
  }
}
filter {
  mutate {
    ### drop the metadata fields so only the raw log line is kept
    remove_field => ["host", "agent", "ecs", "tags", "fields", "@version", "@timestamp", "input", "log", "type", "path"]
  }
}
output {
  kafka {
    codec => json                                ### output format
    topic_id => "test1"                          ### Kafka topic name
    bootstrap_servers => "192.168.10.112:9092"   ### Kafka broker (host:port)
    batch_size => 1
  }
}
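Before starting the pipeline, the configuration can be syntax-checked from the bin directory; this is a quick sanity check that assumes Logstash 5.x or later and the directory layout from step 2:
./logstash -f ../config/error.conf --config.test_and_exit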
4、From the bin directory, run:
./logstash -f ../config/error.conf
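To confirm events are actually flowing, one simple check is to append a line to the watched file from another terminal (the path is the one assumed in the config above):
echo "test error line" >> /wutong/error.log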
5、On the Kafka server, run from /kafka/bin:
./kafka-console-consumer.sh --bootstrap-server 192.168.10.112:9092 --topic test1 --from-beginning
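If the topic does not exist yet (and broker auto-creation is disabled), it can be created first; this sketch assumes Kafka 2.2+, where kafka-topics.sh accepts --bootstrap-server:
./kafka-topics.sh --create --bootstrap-server 192.168.10.112:9092 --topic test1 --partitions 1 --replication-factor 1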
6、#### Problem
Strip the "message" key wrapper from the records written to Kafka (initially unresolved).
### Solution
In the output block, replace the json codec with a plain codec that emits only the message field's value:
output {
  kafka {
    topic_id => "test1"
    bootstrap_servers => "192.168.10.112:9092"
    batch_size => 1
    codec => plain {
      format => "%{message}"
    }
  }
}
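With the plain codec and format => "%{message}", only the value of the message field is written to Kafka instead of the full JSON-serialized event, so the "message" key no longer appears. Re-running the consumer from step 5 should now show the raw log lines:
./kafka-console-consumer.sh --bootstrap-server 192.168.10.112:9092 --topic test1 --from-beginning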