Chaitin (长亭) SafeLine WAF Community Edition: Sending Alerts to WeChat Work (企业微信)

This post describes how to use Logstash to pull logs incrementally from the WAF's PostgreSQL database, reformat and colour them through a self-hosted webhook, and push the result to a WeChat Work group bot. The Logstash side covers the JDBC input plugin and the HTTP output plugin; the webhook side covers the handler script and template substitution. An Nginx reverse proxy is also set up so that the WAF logs linked from the alert can be viewed without logging in.

1. Logstash

1.1 Installing the Logstash container (joined to the safeline-ce container network)

docker run -di --restart=always --log-driver json-file  -p 5044:5044 -p 9600:9600 --name logstash  --net safeline-ce  logstash:8.8.1
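To confirm that the container really joined the WAF's network before going further, you can list that network's members (safeline-ce is the network name used in the command above):

docker network inspect safeline-ce --format '{{range .Containers}}{{.Name}} {{end}}'

Both the SafeLine containers and the new logstash container should appear in the output.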

1.2 Logstash configuration

Notes:

1. The jar referenced by jdbc_driver_library must be downloaded separately.

2. WAF logs are fetched incrementally by id: whenever the id grows, Logstash outputs the new rows. The user.metadata file that records the last id has to be created as an empty file beforehand, and the SQL in statement must end with the incremental condition > :sql_last_value (see the preparation snippet after this list).

3. If the log volume is large, configure paging.

4. In theory the output block could POST the data straight to the WeChat Work bot's webhook over HTTP. Here the events are first sent to a local webhook that tidies up the Markdown formatting, colours and hyperlinks, and that local webhook then forwards the message to the WeChat Work bot's webhook.

5. pushwechatalert in the output url must match the hook id defined later in the webhook's waf.json.
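For notes 1 and 2, the driver jar and the empty metadata file can be prepared like this (the directory matches the paths used in the configuration below; the download URL is the upstream jdbc.postgresql.org location for version 42.6.0 and should be adjusted if you use a different version):

mkdir -p /usr/share/logstash/jdk/bin/pgsql
cd /usr/share/logstash/jdk/bin/pgsql
wget https://jdbc.postgresql.org/download/postgresql-42.6.0.jar
touch user.metadata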

input {
  jdbc {
    # SafeLine's built-in PostgreSQL instance
    jdbc_connection_string => "jdbc:postgresql://169.254.0.2:5432/safeline-ce"
    jdbc_user => "safeline-ce"
    jdbc_password => "F6epaIfxxxxxxxxxxxxxxx64dKbUhhc"
    jdbc_driver_library => "/usr/share/logstash/jdk/bin/pgsql/postgresql-42.6.0.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "300000"
    # Incremental fetch keyed on the id column; the last value is stored in user.metadata
    use_column_value => "true"
    tracking_column => "id"
    tracking_column_type => "numeric"
    record_last_run => "true"
    clean_run => false
    last_run_metadata_path => "/usr/share/logstash/jdk/bin/pgsql/user.metadata"
    statement => "SELECT mgt_detect_log_basic.id as id,timestamp,host,url_path,src_ip,method,query_string,mgt_detect_log_basic.event_id as event_id FROM mgt_detect_log_basic,mgt_detect_log_detail where 1=1 and mgt_detect_log_basic.id = mgt_detect_log_detail.id and mgt_detect_log_basic.id > :sql_last_value"
    # Poll once a minute
    schedule => "* * * * *"
    type => "jdbc"
    jdbc_default_timezone => "Asia/Shanghai"
  }
}
filter {
  json {
    source => "message"
  }
  ruby {
    code => "event.timestamp.time.localtime"
  }
  mutate {
    remove_field => [ "@timestamp","@version","type" ]
  }
}

output {
  # Forward each event to the local webhook; the hook id must match waf.json
  http {
    http_method => "post"
    format => "json"
    url => "http://127.0.0.1:8888/hooks/pushwechatalert"
    content_type => "application/json"
  }
  stdout {
    codec => rubydebug {}
  }
}

1.3 Starting Logstash with this configuration (the container already runs a default Logstash process, so a second instance has to specify --path.data)

nohup logstash -f /usr/share/logstash/jdk/bin/pgsql/logstas-pgsql.conf --path.data=/usr/share/logstash/jdk/bin/pgsql/data > /dev/null 2>&1 &

2. Self-hosted webhook

2.1 Installing webhook (the author installs it inside the same container as Logstash)

wget https://github.com/adnanh/webhook/releases/download/2.8.1/webhook-linux-amd64.tar.gz
tar -zxvf webhook-linux-amd64.tar.gz -C ./
cp webhook-linux-amd64/webhook /usr/bin/
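A quick sanity check that the binary is on the PATH and runnable:

webhook -version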

2.2 webhook configuration (three files)

waf.json (execute-command runs the script that pushes the alert to WeChat Work)

[
  {
    "id": "pushwechatalert",
    "execute-command": "/usr/share/logstash/jdk/bin/pgsql/webhook/ctwaf.sh",
    "pass-arguments-to-command": [
      {
        "source": "entire-payload"
      }
    ]
  }
]
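Make sure the script that execute-command points to actually has the execute bit set, otherwise the hook will trigger but nothing gets pushed:

chmod +x /usr/share/logstash/jdk/bin/pgsql/webhook/ctwaf.sh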

ctwaf.sh (the first block of variables extracts the fields from the log that Logstash posted, the sed block substitutes them into the WeChat Work alert template alert-waf.json, the curl call sends the rendered message to the WeChat bot's webhook address, and the final cp restores the clean template for the next alert.)

#!/bin/bash
# Entire JSON payload forwarded by webhook (pass-arguments-to-command: entire-payload)
PAYLOAD=$1
echo "$1" >> /var/log/test.log
PAYLOAD2=$(echo "$1")
echo "$PAYLOAD2" >> /var/log/payload2.log
# Extract the fields from the payload (field order follows the Logstash SELECT statement)
CT_Event=$(echo "$PAYLOAD2" | awk -F ',' '{print $1}' | awk -F ':' '{print $2}' | sed 's/"\(.*\)"/\1/g')
CT_Time=$(echo "$PAYLOAD2" | awk -F ',' '{print $7}' | awk -F ':' '{print $2}' | xargs -I {} date -d @{} +"%Y-%m-%d %H:%M:%S")
CT_Method=$(echo "$PAYLOAD2" | awk -F ',' '{print $4}' | awk -F ':' '{print $2}' | sed 's/"\(.*\)"/\1/g')
CT_Server=$(echo "$PAYLOAD2" | awk -F ',' '{print $2}' | awk -F ':' '{print $2}' | sed 's/"\(.*\)"/\1/g')
CT_Url=$(echo "$PAYLOAD2" | awk -F ',' '{print $8}' | awk -F '"' '{print $4}')
CT_Sip=$(echo "$PAYLOAD2" | awk -F ',' '{print $6}' | awk -F ':' '{print $2}' | sed 's/"\(.*\)"/\1/g')
# Substitute the extracted values into the alert template
sed -i "s/CT_Event/$CT_Event/g" /usr/share/logstash/jdk/bin/pgsql/webhook/alert-waf.json
sed -i "s/CT_Time/$CT_Time/g" /usr/share/logstash/jdk/bin/pgsql/webhook/alert-waf.json
sed -i "s/CT_Method/$CT_Method/g" /usr/share/logstash/jdk/bin/pgsql/webhook/alert-waf.json
sed -i "s#CT_Url#$CT_Url#g" /usr/share/logstash/jdk/bin/pgsql/webhook/alert-waf.json
sed -i "s/CT_Server/$CT_Server/g" /usr/share/logstash/jdk/bin/pgsql/webhook/alert-waf.json
sed -i "s#CT_Sip#$CT_Sip#g" /usr/share/logstash/jdk/bin/pgsql/webhook/alert-waf.json
# Push the rendered message to the WeChat Work bot webhook
curl -H "Content-Type: application/json" -X POST -d @/usr/share/logstash/jdk/bin/pgsql/webhook/alert-waf.json https://qyapi.weixin.qq.com/cgi-bin/webhook/send?key=xxxxx
# Restore the untouched template for the next alert
cp -f /usr/share/logstash/jdk/bin/pgsql/webhook/alert-waf.json.tm /usr/share/logstash/jdk/bin/pgsql/webhook/alert-waf.json
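Splitting the JSON with awk depends on the field order staying exactly as in the SELECT statement. A sketch of a more robust extraction, assuming jq is installed in the container (the field names are the aliases from the Logstash SELECT, and timestamp is assumed to be a Unix epoch, as in the awk version):

PAYLOAD="$1"
CT_Event=$(echo "$PAYLOAD"  | jq -r '.event_id')
CT_Time=$(date -d @"$(echo "$PAYLOAD" | jq -r '.timestamp')" +"%Y-%m-%d %H:%M:%S")
CT_Method=$(echo "$PAYLOAD" | jq -r '.method')
CT_Server=$(echo "$PAYLOAD" | jq -r '.host')
CT_Url=$(echo "$PAYLOAD"    | jq -r '.url_path')
CT_Sip=$(echo "$PAYLOAD"    | jq -r '.src_ip')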

alert-waf.json (the 192.168.1.1 address in the two link lines at the end is not the WAF itself; it is a separate Nginx host that adds the user's cookie header and proxies the WAF, so the alert details can be viewed without logging in.)


{
"msgtype": "markdown",
  "markdown": {
    "content": "报警类型:<font color=\"info\">长亭</font>,请注意。\n>
报警时间:<font color=\"comment\">CT_Time</font>\n>
请求方法:<font color=\"comment\">CT_Method</font>\n>
请求路径:<font color=\"comment\">CT_Url</font>\n>
请求域名:<font color=\"comment\">CT_Server</font>\n>
攻击IP:<font color=\"comment\">CT_Sip</font>\n>
[攻击报文](http://192.168.1.1:8888/api/DetectLogDetail?event_id=CT_Event)\n>
[免登录查看攻击详情](http://192.168.1.1:8888/logs?page=1&size=20&params=%7B%7D)\n>"
  }
}

2.3 Starting webhook

nohup webhook -hooks waf.json -port 8888 --verbose  > /dev/null 2>&1 &
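Before wiring Logstash to it, the hook can be exercised by hand with a fabricated payload (the field names mirror the aliases in the Logstash SELECT; the values are arbitrary test data):

curl -s -X POST http://127.0.0.1:8888/hooks/pushwechatalert \
  -H "Content-Type: application/json" \
  -d '{"event_id":"test","host":"example.com","url_path":"/","method":"GET","query_string":"","src_ip":"10.0.0.1","timestamp":1700000000}'

If everything is in place, a message (possibly with jumbled fields, since ctwaf.sh parses by position) should arrive in the WeChat Work group.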

3. Nginx: viewing WAF logs without logging in

A separate Nginx container is deployed: proxy_pass points at the WAF console login address, the user's cookie is attached via the proxy header configuration, and a scheduled task periodically visits the WAF so the cookie does not expire. A sketch of the idea follows below.

(Figure 1: the Nginx proxy configuration)
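Since the original configuration was only shared as a screenshot, the following is a minimal sketch of the idea; the WAF console address, listening port and cookie value are placeholders, and proxy_set_header is used here to attach the session cookie to the proxied request:

server {
    listen 8888;

    location / {
        # Placeholder: the SafeLine console address behind the proxy
        proxy_pass https://<waf-console-address>:9443;
        # Placeholder: a valid console session cookie, so the alert links open without a login prompt
        proxy_set_header Cookie "<valid-session-cookie>";
    }
}

A cron job on the same host (for example curl -sk -o /dev/null http://192.168.1.1:8888/logs every few minutes) keeps the proxied session from expiring, as mentioned above.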

Source: a helpful SafeLine (雷池) Community Edition user
