Network Monitoring with AWS VPC Flow Logs and Amazon Athena


A step-by-step guide to building a network-monitored environment in AWS


VPC Flow Logs

VPC Flow Logs is a feature that enables you to capture information about the IP traffic going to and from network interfaces in your VPC. Flow log data can be published to Amazon CloudWatch Logs or Amazon S3.


Amazon Athena

Amazon Athena is an interactive query service that makes it easy to analyze data in Amazon S3 using standard SQL. Athena is serverless, so there is no infrastructure to manage, and you pay only for the queries that you run.


Amazon CloudWatch

CloudWatch is an all-in-one service for logs and metrics, providing observability of your AWS resources through a wide range of features and integrations with many other products.


What we will build

By the end of this guide, we will have a professionally monitored network environment in our VPC. To get there, we will learn how to use and integrate the AWS services listed above to analyze network traffic and get notified about threats. 🧐



Enabling VPC Flow Logs and integrating with S3

Navigate to S3 and create a bucket. Give it a name and copy the bucket ARN; we'll use it later on.

  • Create the bucket in the same region as your VPC.
  • Enable encryption; we are handling sensitive data.

There is a lot to say about server-side encryption in S3. To remain focused, select the AES-256 option and carry on; I'll reference some links about it at the end of this article.
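If you prefer the AWS CLI, the same bucket setup can be sketched as follows. The bucket name and region are placeholders, and the commands assume the CLI is configured with suitable credentials:

```shell
# Create the bucket in the same region as your VPC (name and region are examples).
# Outside us-east-1, also add: --create-bucket-configuration LocationConstraint=<region>
aws s3api create-bucket \
    --bucket my-vpc-flow-logs-bucket \
    --region us-east-1

# Enable default server-side encryption with AES-256
aws s3api put-bucket-encryption \
    --bucket my-vpc-flow-logs-bucket \
    --server-side-encryption-configuration \
    '{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]}'
```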

Navigate to your VPC and click on the Create flow log button.


  • In the Filter option, select All.
  • At Destination, select Send to an S3 bucket and enter the ARN of the bucket that you created earlier.
  • Leave everything else as is and create the flow log.

AWS will automatically create a resource policy on your bucket to grant all the permissions needed by VPC Flow Logs.
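The console steps above map to a single CLI call; the VPC ID and bucket ARN below are placeholders:

```shell
# Capture ALL traffic for the VPC and deliver it to the S3 bucket
aws ec2 create-flow-logs \
    --resource-type VPC \
    --resource-ids vpc-0123456789abcdef0 \
    --traffic-type ALL \
    --log-destination-type s3 \
    --log-destination arn:aws:s3:::my-vpc-flow-logs-bucket
```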

Integrating VPC Flow Logs with CloudWatch

To create a CloudWatch log group, navigate to CloudWatch, click on Log Groups in the left-hand menu, then click on the Create log group button and name it vpc/flowlogs.

Create another VPC flow log, but this time select the log group that you created above as the destination.

Note that you will need to select an IAM role this time; AWS will not create one for you as it did previously.

In another tab, create a new IAM role named DeliverVPCFlowLogsRole with a trust relationship with vpc-flow-logs.amazonaws.com, and attach the policy document below to it.
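Per the AWS documentation for publishing flow logs to CloudWatch Logs, the permissions policy looks like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents",
        "logs:DescribeLogGroups",
        "logs:DescribeLogStreams"
      ],
      "Resource": "*"
    }
  ]
}
```

The trust relationship on the role should allow the vpc-flow-logs.amazonaws.com service principal to perform sts:AssumeRole.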

Go back to your VPC flow log, select the IAM role you created, and hit Create.

CloudWatch Metric Filters

Go to your CloudWatch Log Group and create a new Metric Filter.


To track failed SSH attempts in your VPC, enter the filter pattern below.

[version, account, eni, source, destination, srcport, destport="22", protocol="6", packets, bytes, windowstart, windowend, action="REJECT", flowlogstatus]

In the Test Pattern section, select Custom Log Data, enter the following as the log event messages, and then click Next.

2 086112738802 eni-0d5d75b41f9befe9e 61.177.172.128 172.31.83.158 39611 22 6 1 40 1563108188 1563108227 REJECT OK
2 086112738802 eni-0d5d75b41f9befe9e 182.68.238.8 172.31.83.158 42227 22 6 1 44 1563109030 1563109067 REJECT OK
2 086112738802 eni-0d5d75b41f9befe9e 42.171.23.181 172.31.83.158 52417 22 6 24 4065 1563191069 1563191121 ACCEPT OK
2 086112738802 eni-0d5d75b41f9befe9e 61.177.172.128 172.31.83.158 39611 80 6 1 40 1563108188 1563108227 REJECT OK

In the Assign metric section, change the filter name to ssh-reject and the metric name to SSH Rejects.

Enter 1 in the Metric value field, click on Next, and then Save Changes.
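The same metric filter can also be created from the CLI; the metric namespace used here (VPC) is an example:

```shell
aws logs put-metric-filter \
    --log-group-name vpc/flowlogs \
    --filter-name ssh-reject \
    --filter-pattern '[version, account, eni, source, destination, srcport, destport="22", protocol="6", packets, bytes, windowstart, windowend, action="REJECT", flowlogstatus]' \
    --metric-transformations 'metricName=SSH Rejects,metricNamespace=VPC,metricValue=1'
```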

Metric Filter Alarm

Select your ssh-reject metric filter and click Create Alarm.

Select Greater/Equal as the threshold condition.

In the Define the threshold value field, enter 1 and click on Next.

Create a new topic and name it SSH_DENIED_ACCESS_IN_VPC.

Enter your email in order to receive those notifications and then click on Next.


Enter SSH Rejects as the alarm name, click on Next, and then click the Create Alarm button.

You will need to confirm your subscription to this topic in your email box.

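For reference, the alarm configured above corresponds roughly to this CLI call; the metric namespace and SNS topic ARN are placeholders that must match your own setup:

```shell
aws cloudwatch put-metric-alarm \
    --alarm-name "SSH Rejects" \
    --metric-name "SSH Rejects" \
    --namespace VPC \
    --statistic Sum \
    --period 300 \
    --evaluation-periods 1 \
    --threshold 1 \
    --comparison-operator GreaterThanOrEqualToThreshold \
    --alarm-actions arn:aws:sns:us-east-1:123456789012:SSH_DENIED_ACCESS_IN_VPC
```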

CloudWatch Insights

Let's now run some ad-hoc queries directly on your logs from CloudWatch. Go to the Insights menu, click the Queries folder icon in the right-hand menu, select Sample queries > VPC flow log queries > Top 20 source IP addresses with the highest number of rejected requests, and select your log group.
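The sample query behind that menu entry looks roughly like this (srcAddr and action are among the flow log fields that CloudWatch Logs Insights discovers automatically):

```
filter action = "REJECT"
| stats count(*) as numRejections by srcAddr
| sort numRejections desc
| limit 20
```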

Play around with the other queries; you will see some very useful ones, and take a look at the Amazon docs later. Note how the queries are written: the syntax is easy and somewhat close to ANSI-92 SQL, but to run ad-hoc queries in standard SQL, the best option is Athena.

Amazon Athena

Navigate to Athena.


Use the following code, replacing the fields between <> with your S3 flow log bucket location, and run the query.
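Based on the AWS documentation for querying the default flow log format with Athena, a table definition along these lines works (replace the <> fields with your bucket name, account ID, and region):

```sql
CREATE EXTERNAL TABLE IF NOT EXISTS vpc_flow_logs (
  version int,
  account string,
  interfaceid string,
  sourceaddress string,
  destinationaddress string,
  sourceport int,
  destinationport int,
  protocol int,
  numpackets int,
  numbytes bigint,
  starttime int,
  endtime int,
  action string,
  logstatus string
)
PARTITIONED BY (dt string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ' '
LOCATION 's3://<your-bucket>/AWSLogs/<account-id>/vpcflowlogs/<region>/'
TBLPROPERTIES ("skip.header.line.count"="1");
```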

To be able to read the data, we need to create a partition on our Athena table. To do that, run another query using the following template (note the fields that need to be replaced).
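A partition-creation template might look like this, assuming the table is named vpc_flow_logs; the date and S3 path are examples to replace with your own:

```sql
ALTER TABLE vpc_flow_logs
ADD PARTITION (dt = '2020-05-04')
LOCATION 's3://<your-bucket>/AWSLogs/<account-id>/vpcflowlogs/<region>/2020/05/04';
```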

Now you can read your data using standard SQL syntax directly from your S3 bucket, by simply running the following script.

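For example, assuming a flow log table named vpc_flow_logs with columns matching the default flow log fields, this query lists the top sources of rejected SSH traffic:

```sql
-- Top 20 sources of rejected SSH traffic
SELECT sourceaddress, count(*) AS rejections
FROM vpc_flow_logs
WHERE action = 'REJECT'
  AND destinationport = 22
GROUP BY sourceaddress
ORDER BY rejections DESC
LIMIT 20;
```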

Athena is a powerful serverless service and can be very useful for tons of use cases. It can process unstructured data sets as well as columnar data formats such as Apache Parquet. You can also use Athena to generate reports and integrate it with your BI tools.

What we have built

I hope this walkthrough has helped you implement and understand some network security concerns. Plus, it can be a kick-off for further investigation and easy implementation of security best practices.

We have played around with lots of VPC Flow Logs integrations; now you have a professionally monitored network environment.

What's next?

Take a deep look at Amazon Athena. It is a useful serverless service. Try to have some fun by uploading some files to S3 and reading their data on Athena.


Play with CloudWatch Insights as well, but do not worry too much about becoming a pro; there are useful queries available online, and the sample queries offered by AWS may be enough for tons of use cases.

References and useful links

Translated from: https://codeburst.io/network-monitoring-with-aws-vpc-flow-logs-and-amazon-athena-de94969f4175
