Using GitHub Actions for Integration Testing on a REST API


In this post, I’m going to show you how to configure a GitHub Action as a continuous integration pipeline for a REST API.

Motivation

Andrés, a friend of mine, gave a talk about testing REST APIs using Postman’s test scripts. For his talk, he decided to include a demo that uses a side project of mine. This project is small and it’s for personal use only (I still need to improve a lot of things), but receiving help with adding some integration tests was something I could not let pass. So I joined efforts with Andrés, and while he worked on writing the tests, I worked on getting a basic CI pipeline for showcasing them.

The Project

AHM (Application for Hypertension Monitoring) is a small project I have been working on for a couple of months now. It aims to provide an easy tracking interface for my blood pressure measurements (systolic and diastolic readings). I started this project with serverless and Golang in mind, but due to an invitation I got from EDTeam (for which I had to record a Python course), I then switched it to Python.

The AHM service is far from being production-ready but, regardless, you can get familiar with its architecture and implementation by checking the repository’s README. For this post, I’d rather have your attention on just a couple of endpoints that the API offers:

Image: Endpoints offered by the AHM service

The first and third endpoints from the image above are the ones that the integration tests will cover, to a limited extent. Before moving on to the tests themselves, it is also important to be aware of the stack on which the API will run:

  • MongoDB: The data (the measurements) will live in MongoDB documents.

  • Python 3 with Flask-RESTful: This great Flask-based framework helps me guarantee that the application is REST-compliant.

The Tests

Let’s start this section by defining what integration tests are. Software Testing Fundamentals places integration testing as the next layer of testing above the unit one, and defines it as:

A level of software testing where individual units are combined and tested as a group. The purpose of this level of testing is to expose faults in the interaction between integrated units.


The units I want to combine and test are basically the data layer (the database) and the API layer. Therefore, by having integration tests in place, what I want to validate is that data is retrieved from the database to the API layer, and that data sent from the API layer gets stored in the database.

For the scope of Andrés’s talk, only two endpoints were covered:

  • GET /v1/measurement: This endpoint lists the latest 10 readings recorded by a user.

  • POST /v1/measurement: This endpoint allows us to create a reading for a user.

The tests were implemented using Postman’s test scripts, which let you define tests based on the response object provided by the JavaScript API that Postman offers. Together with newman, the tests can be run via the CLI, meaning you don’t need Postman installed to run collections written with Postman test scripts.

Image: Tests for the readings list
Image: Tests for create reading

As can be seen in the two previous images, the tests written were rather simple and demonstrative. For production-ready applications, more exhaustive tests covering edge cases will surely be required. Once you have enough tests, you can export the Postman collection together with the Postman environment file and run them via the CLI. For instance, if you name the exported files ahm_api_calls.json and ahm_local.json, with Newman already installed, you can run a command similar to:

newman run ahm_api_calls.json -e ahm_local.json

You can check the exported collections I used for the pipeline (I took them from Andrés’s talk) inside the postman folder.

The Pipeline

With the tests in place, and a mechanism for running them via the CLI, all that was left to work on was the actual pipeline. I could have selected a fresh installation of Jenkins or a SaaS CI offering like Travis, but I decided to go with GitHub Actions because I wanted to learn how it works and for which use cases it can be used (as CI, for instance). Besides, the source code was already hosted on GitHub, so when you read something like:

GitHub Actions makes it easy to automate all your software workflows, now with world-class CI/CD. Build, test, and deploy your code right from GitHub. Make code reviews, branch management, and issue triaging work the way you want.


It immediately clicks in your brain (at least it did in mine) that you can have the benefits of automation right there, in the same place where your code is hosted.

The targeted CI pipeline will be a GitHub Action workflow, which can be defined as a YAML file that needs to live inside your GitHub repository, in the .github/workflows directory. Pretty much like everything that automates infrastructure nowadays, you need to specify inside that YAML which steps must be executed for your pipeline to be considered done. Another important feature is that, within a GitHub workflow, any step may use Docker, so you would be able to run virtually any technology that already runs on Docker. Moreover, there is already a Marketplace with tons of actions (steps) ready to be used. In fact, the pipeline I built uses a few of them.

Let me now share the final pipeline I came up with, which allows running Postman scripts against a Python REST API:

GitHub Action Workflow

“On” section


Controls when the action will run. Triggers the workflow on push or pull request events but only for the master branch.


on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

This section is kind of self-explanatory: the whole pipeline will be triggered only when a push or pull request to the master branch happens.

“Build” section


jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: [12.x]
        mongodb-version: [4.2]
        python-version: [3.7]

A workflow run is made up of one or more jobs that can run sequentially or in parallel.


Here I’m defining which operating system the workflow will run on, as well as the version matrix of all the subsystems that the integration tests will use. What is amazing about this is that, with the same pipeline, I can test how my application behaves with different versions of MongoDB and different versions of Python. For this example, I’m using only one version of each subsystem, but nothing really stops me from extending the matrix to include other versions.
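For instance, extending the matrix to cover an extra MongoDB and Python version would be a change as small as this (a sketch, with versions I have not actually run against AHM):

```yaml
strategy:
  matrix:
    node-version: [12.x]
    mongodb-version: [4.2, 4.4]
    python-version: [3.7, 3.8]
```

GitHub would then spawn one job per combination (here 1 × 2 × 2 = 4 jobs), each running the full set of steps.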

“Steps” section


Steps represent a sequence of tasks that will be executed as part of the job.


Within the steps section is where the fun happens. It’s here where each specific unit of execution needs to be defined.

System dependencies:


- name: Git checkout
  uses: actions/checkout@v2
- name: Install Node JS ${{ matrix.node-version }}
  uses: actions/setup-node@v1
  with:
    node-version: ${{ matrix.node-version }}
- name: Start MongoDB ${{ matrix.mongodb-version }}
  uses: supercharge/mongodb-github-action@1.3.0
  with:
    mongodb-version: ${{ matrix.mongodb-version }}
- name: Set up Python ${{ matrix.python-version }}
  uses: actions/setup-python@v2
  with:
    python-version: ${{ matrix.python-version }}

These four initial steps take care of installing all the subsystem dependencies required for running the integration tests. In other words, at the end of these steps, I will have an Ubuntu machine ready to be used, with the versions of Node JS (for running Newman), MongoDB (for the data), and Python (for the API) specified in the matrix. Something to notice is that none of those steps was implemented by me; I borrowed them (check the uses keys) from the GitHub Marketplace.

Package dependencies:


- name: Install Python dependencies
  run: |
    python -m pip install --upgrade pip
    if [ -f api/requirements.txt ]; then pip install -r api/requirements.txt; fi
- name: Install Newman dependency
  run: npm install --prefix tests/postman/

The next two steps defined in the workflow take care of installing the package dependencies, both for the API written in Python and for the Newman package that lives in the NPM registry. As I’m not using any third-party steps here but my own, there is no uses section; instead, there is a run section, in which one can put bash instructions defining the step itself.

Executing the tests:


- name: Run the API and Postman's tests
  run: |
    cd api && flask run &
    sleep 2
    cd tests/postman/ && ./node_modules/newman/bin/newman.js run ahm_api_calls.postman_collection.json -e ahm-local.postman_environment.json
    kill -9 `lsof -i:5000 -t`
  env:
    FLASK_ENV: development
    API_HOST: 0.0.0.0
    FLASK_APP: main.py
    DB_NAME: ahm
    DB_HOST: localhost
    CI: true

This was the hardest step to implement, and I have to admit the strategy I used for it is hacky (if you can think of something better, let me know). Why is that? Well, all the steps in my pipeline need to run sequentially; one needs to finish before the next one can start. That brought me to the question: how can I run the API and run the tests in independent steps? I could not find an elegant solution for it, so I decided to run both the API and the tests within one step, and to use my old friends, the sleep and kill commands, to manage a smooth step termination. The idea behind this step was:

  • I run the API in the background and wait for 2 seconds, giving it enough time to boot and run.

  • I then run the collections via Newman. If they fail, the step is considered failed, and the error shows up in the GitHub UI.

  • If the tests pass, I then need to kill the API so the step can be considered successfully ended.

Not the most elegant solution, of course, but at least a solution that can be explained simply enough. Something else to notice is that a step can also inject variables into the environment, so all the ones that the API needs were passed statically via the env section.
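On the “something better” front, one refinement I sketched (but have not actually run, so take it as an assumption) replaces the fixed sleep with a small readiness poll, keeping the same env section as above:

```yaml
- name: Run the API and Postman's tests
  run: |
    cd api && flask run &
    # Poll until the API accepts connections (up to ~10 seconds),
    # instead of hoping that 2 seconds is always enough.
    for i in $(seq 1 20); do
      curl -s -o /dev/null http://localhost:5000/ && break
      sleep 0.5
    done
    cd tests/postman/ && ./node_modules/newman/bin/newman.js run ahm_api_calls.postman_collection.json -e ahm-local.postman_environment.json
    kill -9 `lsof -i:5000 -t`
```

Note that curl without -f exits successfully on any HTTP response, so this only checks that the server is up and listening, not that a particular endpoint is healthy.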

The pipeline in action:


Image: The pipeline in action

If you want to check one of the logs coming from the implemented pipeline, you can check the Actions page of the repository:

There, you can see all of the steps that were executed and what the results were for each of them. There are also plenty of failed runs that you can inspect and learn from: some because of safety measures (I don’t want to hit the free quotas), and others from runs made while I was getting to the final implementation.

Conclusions

GitHub Actions can be used for building CI and CD pipelines upon Github repositories. It is even possible to include integration testing, which implies having all subsystem dependencies in place (installed and running).


As a TL;DR: if your CI pipeline is “simple enough”, you don’t really need to go and implement it outside of GitHub. The whole software life cycle can now be implemented within the same space with GitHub Actions, which I think is great, because your focus and energy can be concentrated on one tool only.

I also believe this can give some level of freedom to QA automation crews, who more often than not find themselves begging their team leads for some sort of sandbox that allows them to run integration or end-to-end testing.

With the number of actions already implemented and available on the Marketplace, most of the regular use cases should already be covered, so try finding an existing action before implementing one on your own.

Finally, GitHub Actions might be a nice way of getting started with open source. I, for instance, filed a feature request for one of the actions I used in my pipeline: https://github.com/supercharge/mongodb-github-action/issues/12.

Translated from: https://medium.com/weekly-webtips/using-github-actions-for-integration-testing-on-a-rest-api-358991d54a20
