Using the Log4j2 logging framework in a Spring Boot project to ship logs through Kafka, Logstash, and OpenSearch
1. The log4j2-spring.xml file (Spring Boot picks up the -spring variant of the Log4j2 config automatically)
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="warn">
    <Appenders>
        <!-- Unified logging: login events go to the Kafka topic "unified_login" -->
        <Kafka name="InOutLog" topic="unified_login">
            <!-- Forward only messages whose text contains "Login" -->
            <RegexFilter regex=".*Login.*" onMatch="ACCEPT" onMismatch="DENY"/>
            <!-- Send just the raw message body, which the controller fills with JSON -->
            <PatternLayout charset="UTF-8" pattern="%msg"/>
            <Property name="bootstrap.servers">127.0.0.1:9092</Property>
        </Kafka>
        <!-- Asynchronous wrapper so application threads do not block on Kafka -->
        <Async name="asyncKafka">
            <AppenderRef ref="InOutLog"/>
        </Async>
    </Appenders>
    <Loggers>
        <Root level="info">
            <AppenderRef ref="asyncKafka"/>
        </Root>
    </Loggers>
</Configuration>
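For this configuration to load, the project must swap Spring Boot's default Logback for Log4j2 and have the Kafka client on the classpath. A minimal sketch of the Maven dependencies, assuming a standard Spring Boot build (spring-boot-starter-logging must also be excluded from any starter that pulls it in; the starter version is managed by the Boot parent, and kafka-clients should match your broker version):

    <!-- Replace Spring Boot's default Logback with Log4j2 -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-log4j2</artifactId>
    </dependency>
    <!-- Required at runtime by Log4j2's Kafka appender -->
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
    </dependency>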
2. Sending a log from Java code
@GetMapping("/sendlog")
public void test() {
    System.out.println("Sending log...");
    UnionLogDto unionLogDto = new UnionLogDto();
    unionLogDto.setDescription("OA SSO");
    unionLogDto.setActionType(ActionType.Create);
    // "LoginTag" makes the serialized JSON match the appender's RegexFilter (.*Login.*)
    unionLogDto.setTag(new String[]{"LoginTag", "123456"});
    // Serialize with fastjson and log at INFO; the async Kafka appender picks this up
    log.info(JSON.toJSONString(unionLogDto));
    System.out.println("Sent successfully");
}
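The controller references a UnionLogDto and an ActionType that the post does not show. A minimal sketch inferred from the setters above (the field names follow from the setters, but the enum constants beyond Create are unknown; @Data matches the setter-based usage, and the log variable suggests Lombok's @Slf4j is already on the controller):

    import lombok.Data;

    // Hypothetical enum; only Create appears in the example above
    enum ActionType { Create }

    @Data // Lombok generates the getters/setters used by the controller
    public class UnionLogDto {
        private String description;
        private ActionType actionType;
        private String[] tag;
    }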
3. The logstash.conf pipeline file
The ruby filter plugin is used to give the array in the JSON special handling, flattening it into two scalar fields.
input {
  kafka {
    bootstrap_servers => "127.0.0.1:9092"   # a string, not an array
    client_id => "test"
    group_id => "test"
    auto_offset_reset => "latest"
    consumer_threads => 5
    decorate_events => true                 # adds Kafka metadata under [@metadata][kafka]
    topics => ["unified_login"]
    codec => "json"                         # parse the JSON body produced by the %msg layout
  }
}
filter {
  ruby {
    # Flatten the "tag" array into two scalar fields. Note that the dotted
    # names are literal top-level field names in Logstash, so they survive
    # the remove_field of "tag" below; OpenSearch then maps them as the
    # object fields tag.tag0 / tag.tag1.
    code => "
      tags = event.get('[tag]').to_a
      event.set('tag.tag0', tags[0])
      event.set('tag.tag1', tags[1])
    "
  }
  mutate {
    # Drop the original array once it has been flattened
    remove_field => ["tag"]
  }
}
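To make the filter concrete: for the event produced by the controller above, the document changes roughly as follows (fastjson serializes the enum by name):

Before the filter:
  { "actionType": "Create", "description": "OA SSO", "tag": ["LoginTag", "123456"] }
After the filter and mutate:
  { "actionType": "Create", "description": "OA SSO", "tag.tag0": "LoginTag", "tag.tag1": "123456" }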
output {
  opensearch {
    hosts => ["http://127.0.0.1:9211", "http://127.0.0.1:9212", "http://127.0.0.1:9213"]
    index => "login-test"
    user => "admin"
    password => "XXXXXXXXX"
    action => "create"
  }
}
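A quick way to confirm documents are arriving, assuming the same credentials as in the output block above, is to query the index directly:

  curl -u admin:XXXXXXXXX "http://127.0.0.1:9211/login-test/_search?pretty"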
4. The OpenSearch dashboard then shows the flattened login events: