Automate Python libraries deployment at AWS Lambda layer from Pipfile with Terraform

This article shows how to use Terraform to automate deploying Python libraries from a Pipfile to an AWS Lambda layer, making Lambda functions more efficient to deploy and easier to manage.

AWS Lambda, part of Amazon Web Services (AWS), is a serverless computing service that works as FaaS (Function as a Service). A FaaS lets users develop and manage application services without having to think about infrastructure.


Terraform is an Infrastructure as Code (IaC) tool developed by HashiCorp to manage resources of cloud services such as AWS, Google Cloud, and Azure. It is open source and written in Go.


Zipping code and uploading it to AWS Lambda on every deployment is always a challenge. The trickier part is uploading library code, e.g. Python libraries. At Craftsmen, we need to manage a lot of Lambdas for various development purposes, so a smart way to upload Lambda function code and libraries during deployment is badly needed.


Our approach is to upload function code at the function level and libraries in a Lambda layer. The reasons are:
1. Python libraries can be shared between Lambdas.
2. The console code editor can only display 3 MB of code, so keeping libraries in a layer leaves the function code small enough to view in the console editor.


Setting up the project

Let’s start by setting up the project skeleton. We are going to use pipenv because it is more developer-friendly for maintaining dev and release packages. First, we install pipenv, then we install Terraform.

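As a rough sketch (the exact commands below are assumptions, not from the original post: pip is assumed to be available for Python 3, and Homebrew is used for Terraform; any other official install method works just as well):

# Install pipenv for the current user (assumes pip for Python 3)
pip install --user pipenv

# Install Terraform (assumes macOS/Linux with Homebrew; otherwise
# download the binary from the Terraform downloads page)
brew install terraform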

# Create a project directory
mkdir lambda-with-terraform
cd lambda-with-terraform

# Create lambda code directory
mkdir lambda

# Create Terraform directory
mkdir terraform

# Add handler.py file in lambda directory
touch lambda/handler.py

# Add Pipfile in project root directory
touch Pipfile

# So our skeleton will look like this
tree
├── lambda
│   └── handler.py
├── Pipfile
└── terraform

Add Python libraries

We will use only a single library, requests, in [packages], plus pytest and pipenv in [dev-packages]. Also, we are going to use Python 3.8. Let’s add them all to the Pipfile.


# Pipfile
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true

[dev-packages]
pytest = "==5.3.5"
pipenv = "==2020.8.13"

[packages]
requests = "==2.23.0"

[requires]
python_version = "3.8"

Initialize the Python virtual environment with pipenv:


pipenv install

This will create the Pipfile.lock file, which contains the pinned information for all Python packages.

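If you want to double-check what was resolved, pipenv can print the dependency tree. This is just a quick sanity check, not part of the original workflow:

# Show requests plus its transitive dependencies
# (certifi, chardet, idna, urllib3)
pipenv graph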

Add function code

Let’s start with a simple Lambda function which just fetches a web page and logs it.


# handler.py
import requests
import logging

LOGGER = logging.getLogger()
LOGGER.setLevel(logging.INFO)


def lambda_handler(event, context):
    response = requests.get("https://example.com")
    LOGGER.info(response.text)

Add Terraform code

First, we will add some files for Terraform:


# Create file for lambda terraform codes
touch terraform/lambda.tf

# Create file for lambda layer terraform codes
touch terraform/lambda_layer.tf

# Create python file to make pip requirements file from Pipfile
touch terraform/requirements_creator.py

# Add shell script to generate pip requirements and make zip file of
# lambda libraries
touch terraform/build_layer.sh
chmod a+x terraform/build_layer.sh

In requirements_creator.py, we add a Python argument parser that takes the filename of the pip requirements file, e.g. requirements.txt, and generates that file from the Pipfile.


# terraform/requirements_creator.py
import argparse

from pipenv.project import Project
from pipenv.utils import convert_deps_to_pip


def _make_requirements_file(file_path):
    pipfile = Project(chdir=False).parsed_pipfile
    requirements = convert_deps_to_pip(pipfile['packages'], r=False)
    with open(file_path, 'w') as req_file:
        req_file.write('\n'.join(requirements))


def run():
    parser = argparse.ArgumentParser()
    parser.add_argument(
        '--file_path',
        '-file_path',
        type=str,
        default='requirements.txt'
    )
    args = parser.parse_args()
    _make_requirements_file(args.file_path)


if __name__ == "__main__":
    run()
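A quick way to see what the script produces (a sketch; run it from the project root, inside the pipenv shell, so the Pipfile is found):

python3 terraform/requirements_creator.py --file_path requirements.txt
cat requirements.txt
# requests==2.23.0
rm requirements.txt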

Now let’s add a shell script to generate a zip file for the Lambda layer containing the Python libraries.


# terraform/build_layer.sh
DESTINATION_DIR=${DESTINATION_DIR:-$PWD}
MODULE_DIR=${MODULE_DIR:-$PWD}
ZIPFILE_NAME=${ZIPFILE_NAME:-layer}

echo "Module dir $MODULE_DIR"
echo "Destination dir $DESTINATION_DIR"

TARGET_DIR=$DESTINATION_DIR/$ZIPFILE_NAME
echo "Target dir $TARGET_DIR"
mkdir -p "$TARGET_DIR"

REQUIREMENTS_FILE_PATH=$MODULE_DIR/requirements.txt
python3 "$MODULE_DIR"/requirements_creator.py --file_path "$REQUIREMENTS_FILE_PATH"

pip install -r "$REQUIREMENTS_FILE_PATH" -t "$TARGET_DIR"/python

(cd "$TARGET_DIR" && zip -r "$DESTINATION_DIR"/"$ZIPFILE_NAME".zip ./* -x "*.dist-info*" -x "*__pycache__*" -x "*.egg-info*")

rm "$REQUIREMENTS_FILE_PATH"
rm -r "$TARGET_DIR"
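Terraform will normally run this script for us, but for debugging it can be exercised by hand (a sketch; the environment variables mirror what the Terraform code below passes in):

cd terraform
DESTINATION_DIR=$PWD/lambda_zip MODULE_DIR=$PWD ZIPFILE_NAME=layer ./build_layer.sh

# The libraries must end up under a top-level python/ directory in the zip
unzip -l lambda_zip/layer.zip | head
cd ..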

Now add the Lambda layer Terraform code:


// terraform/lambda_layer.tf
locals {
  // All lambda codes zip and layer zip file directory
  lambda_artifact_dir       = "${path.module}/lambda_zip"
  lambda_layer_zipfile_name = "layer"
  python_version            = "python${data.external.python_version.result.version}"
}

// Grab python version from Pipfile. Default is 3.8 if not mentioned
// in Pipfile
data "external" "python_version" {
  program = [
    "python3",
    "-c",
    "from pipenv.project import Project as P; import json; _p = P(chdir=False); print(json.dumps({'version': _p.required_python_version or '3.8'}))"
  ]
}

// Generate zipfile for lambda layer
resource "null_resource" "build_lambda_layer" {
  provisioner "local-exec" {
    when    = create
    command = "./${path.module}/build_layer.sh"
    environment = {
      DESTINATION_DIR = abspath(local.lambda_artifact_dir)
      MODULE_DIR      = abspath(path.module)
      ZIPFILE_NAME    = local.lambda_layer_zipfile_name
    }
  }
  triggers = {
    // Trigger only when something changes in Pipfile
    run_on_pipfile_change = filemd5("${abspath(path.module)}/../Pipfile")
  }
}

resource "aws_lambda_layer_version" "lambda_layer" {
  filename            = "${local.lambda_artifact_dir}/${local.lambda_layer_zipfile_name}.zip"
  layer_name          = "lambda_layer"
  compatible_runtimes = [local.python_version]

  // It will run after lambda layer zipfile build
  depends_on = [null_resource.build_lambda_layer]

  lifecycle {
    create_before_destroy = true
  }
}
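Before the first apply it is worth running the external data source’s one-liner by hand (the same command as in the program list above, run from the project root inside the pipenv shell); with our Pipfile it should print 3.8:

python3 -c "from pipenv.project import Project as P; import json; _p = P(chdir=False); print(json.dumps({'version': _p.required_python_version or '3.8'}))"
# {"version": "3.8"}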

Finally, we add the Lambda function Terraform code:


// terraform/lambda.tf
// Zip lambda function codes
data "archive_file" "lambda_zip_file" {
  output_path = "${local.lambda_artifact_dir}/lambda.zip"
  source_dir  = "${path.module}/../lambda"
  excludes    = ["__pycache__", "*.pyc"]
  type        = "zip"
}

data "aws_iam_policy_document" "lambda_assume_role" {
  version = "2012-10-17"
  statement {
    sid    = "LambdaAssumeRole"
    effect = "Allow"
    actions = [
      "sts:AssumeRole"
    ]
    principals {
      identifiers = [
        "lambda.amazonaws.com"
      ]
      type = "Service"
    }
  }
}

// Lambda IAM role
resource "aws_iam_role" "lambda_role" {
  name               = "test-lambda-role"
  assume_role_policy = data.aws_iam_policy_document.lambda_assume_role.json

  lifecycle {
    create_before_destroy = true
  }
}

// Lambda function terraform code
resource "aws_lambda_function" "lambda_function" {
  function_name    = "test-lambda-function"
  filename         = data.archive_file.lambda_zip_file.output_path
  source_code_hash = data.archive_file.lambda_zip_file.output_base64sha256
  handler          = "handler.lambda_handler"
  role             = aws_iam_role.lambda_role.arn
  runtime          = local.python_version
  layers           = [aws_lambda_layer_version.lambda_layer.arn]

  lifecycle {
    create_before_destroy = true
  }
}

Time to test

To test that everything works, we have to add a Terraform provider. In our case the provider is AWS. Let’s add a provider.tf file in the terraform directory:


# terraform/provider.tf
provider "aws" {
  region  = "eu-west-1"
  profile = "aws-profile-name-from-aws-config-file-at-your-machine"
}
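If you prefer not to hard-code the profile and region (just an alternative, not from the original article), the AWS provider also picks them up from the standard environment variables:

export AWS_PROFILE=aws-profile-name-from-aws-config-file-at-your-machine
export AWS_DEFAULT_REGION=eu-west-1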

The project will look like this:


.
├── lambda
│   └── handler.py
├── Pipfile
├── Pipfile.lock
└── terraform
    ├── build_layer.sh
    ├── lambda_layer.tf
    ├── lambda.tf
    ├── provider.tf
    └── requirements_creator.py

Let’s build the infrastructure 😹


# Activate pipenv virtual environment
pipenv shell

# Go to terraform directory
cd terraform

# Initialize terraform
terraform init

# Check terraform infrastructure components to deploy
terraform plan

# And deploy with
terraform apply

# If you want to destroy all
terraform destroy
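Once terraform apply finishes, a simple way to verify the function end to end (assuming the same profile and region as in provider.tf) is to invoke it with the AWS CLI and then look for the HTML of https://example.com in the function’s CloudWatch logs:

aws lambda invoke \
    --function-name test-lambda-function \
    --region eu-west-1 \
    --profile aws-profile-name-from-aws-config-file-at-your-machine \
    response.json

# The logged page body shows up in CloudWatch Logs under
# /aws/lambda/test-lambda-function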

Translated from: https://medium.com/craftsmenltd/automate-python-libraries-deployment-at-aws-lambda-layer-from-pipfile-with-terraform-d28de0eb765f
