4. AWS Lambda

Overview

  • Lambda is a compute service that lets you run code without provisioning or managing servers.
  • Lambda runs your code on a high-availability compute infrastructure and performs all of the administration of the compute resources, including server and operating system maintenance, capacity provisioning and automatic scaling, code monitoring and logging.
  • With Lambda, you can run code for virtually any type of application or backend service.
  • You can invoke your Lambda functions using the Lambda API, or Lambda can run your functions in response to events from other AWS services. 
  • You cannot log in to compute instances or customize the operating system on provided runtimes.
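
As a concrete illustration of invoking a function through the Lambda API, here is a minimal sketch using the AWS SDK for Python (boto3). The function name my-function and the payload fields are assumptions for the example; any existing function and event shape would do.

```python
import json

import boto3

lambda_client = boto3.client("lambda")

# Synchronous invocation: Lambda runs the function and returns its response.
response = lambda_client.invoke(
    FunctionName="my-function",                 # hypothetical function name
    InvocationType="RequestResponse",           # wait for the function to finish
    Payload=json.dumps({"orderId": "12345"}),   # the event the handler receives
)

print(response["StatusCode"])                    # 200 for a successful call
print(json.loads(response["Payload"].read()))    # the handler's return value
```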

Lambda concepts

  • A function is a resource that you can invoke to run your code in Lambda.
  • A trigger is a resource or configuration that invokes a Lambda function.
  • An event is a JSON-formatted document that contains data for a Lambda function to process (see the handler sketch after this list).
  • An execution environment provides a secure and isolated runtime environment for your Lambda function. An execution environment manages the processes and resources that are required to run the function.
  • You deploy your Lambda function code using a deployment package. Lambda supports two types of deployment packages:
    • A .zip file archive that contains your function code and its dependencies. Lambda provides the operating system and runtime for your function.
    • A container image: You add your function code and dependencies to the image. You must also include the operating system and a Lambda runtime.
  • The runtime provides a language-specific environment that runs in an execution environment. 
  • A Lambda layer is a .zip file archive that can contain additional code or other content. 
    • Layers provide a convenient way to package libraries and other dependencies that you can use with your Lambda functions.
    • Using layers reduces the size of uploaded deployment archives and makes it faster to deploy your code.
    • You can include up to five layers per function.
    • Functions deployed as a container image do not use layers. 
  • Lambda extensions enable you to augment your functions. 
    • An internal extension runs in the runtime process and shares the same lifecycle as the runtime.
    • An external extension runs as a separate process in the execution environment. The external extension is initialized before the function is invoked, runs in parallel with the function's runtime, and continues to run after the function invocation is complete.
  • Concurrency is the number of requests that your function is serving at any given time.
  • When you invoke or view a function, you can include a qualifier to specify a version or alias.
    • A version is an immutable snapshot of a function's code and configuration that has a numerical qualifier. For example, my-function:1.
    • An alias is a pointer to a version that you can update to map to a different version, or split traffic between two versions. 
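
To make the function, event, and handler concepts concrete, here is a minimal sketch of a Python function. The handler name and the event fields are assumptions for the example.

```python
import json


def lambda_handler(event, context):
    """Handler that Lambda calls for each invocation.

    `event` is the JSON-formatted document delivered by the trigger or the
    caller, already deserialized into a Python dict. `context` exposes
    runtime information such as the request ID.
    """
    order_id = event.get("orderId", "unknown")   # hypothetical event field
    print(f"Processing order {order_id}, request {context.aws_request_id}")

    # The return value becomes the response for synchronous invocations.
    return {
        "statusCode": 200,
        "body": json.dumps({"orderId": order_id, "status": "processed"}),
    }
```

If this code lives in lambda_function.py, the handler setting in the function configuration would be lambda_function.lambda_handler, which is the method the runtime calls for each event.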

Lambda features

  • Authoring specifics vary between runtimes, but all runtimes share a common programming model that defines the interface between your code and the runtime code. You tell the runtime which method to run by defining a handler in the function configuration, and the runtime runs that method.
  • Lambda manages the infrastructure that runs your code, and scales automatically in response to incoming requests.
  • Use concurrency settings to ensure that your production applications are highly available and highly responsive (a configuration sketch follows this list).
    • To prevent a function from using too much concurrency, and to reserve a portion of your account's available concurrency for a function, use reserved concurrency. 
      • Other functions can't prevent your function from scaling: When a function has reserved concurrency, no other function can use that concurrency.
      • Your function can't scale out of control: Reserved concurrency also limits the maximum concurrency for the function, and applies to the function as a whole, including versions and aliases.
    • To enable functions to scale without fluctuations in latency, use provisioned concurrency. For functions that take a long time to initialize, or that require extremely low latency for all invocations, provisioned concurrency enables you to pre-initialize instances of your function and keep them running at all times.
  • When you invoke a function, you can choose to invoke it synchronously or asynchronously.
    • For asynchronous invocations, Lambda handles retries if the function returns an error or is throttled.
  • To process items from a stream or queue, you can create an event source mapping.
    • An event source mapping is a resource in Lambda that reads items from an Amazon Simple Queue Service (Amazon SQS) queue, an Amazon Kinesis stream, or an Amazon DynamoDB stream, and sends the items to your function in batches.
    • Each event that your function processes can contain hundreds or thousands of items.
    • Event source mappings maintain a local queue of unprocessed items and handle retries if the function returns an error or is throttled.
  • A destination is an AWS resource that receives invocation records for a function.
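
The concurrency settings and event source mappings described above can be configured through the Lambda API. A sketch with boto3 follows; the function name, alias, queue ARN, and account ID are placeholders.

```python
import boto3

lambda_client = boto3.client("lambda")

# Reserved concurrency: caps the function's concurrency and sets that
# capacity aside so other functions in the account cannot consume it.
lambda_client.put_function_concurrency(
    FunctionName="my-function",
    ReservedConcurrentExecutions=100,
)

# Provisioned concurrency: pre-initializes execution environments for a
# published version or alias so invocations avoid cold-start latency.
lambda_client.put_provisioned_concurrency_config(
    FunctionName="my-function",
    Qualifier="prod",                      # an alias or version, not $LATEST
    ProvisionedConcurrentExecutions=10,
)

# Event source mapping: Lambda polls the queue and invokes the function
# with batches of records.
lambda_client.create_event_source_mapping(
    EventSourceArn="arn:aws:sqs:us-east-1:123456789012:orders-queue",
    FunctionName="my-function",
    BatchSize=10,
)
```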

AWS Lambda permissions

  • AWS Lambda execution role
    • A Lambda function's execution role is an AWS Identity and Access Management (IAM) role that grants the function permission to access AWS services and resources.
    • You provide this role when you create a function, and Lambda assumes the role when your function is invoked
  • Resource-based policies
    • Resource-based policies let you grant usage permission to other AWS accounts on a per-resource basis.
    • You also use a resource-based policy to allow an AWS service to invoke your function on your behalf.
  • Identity-based IAM policies
    • You can use identity-based policies in AWS Identity and Access Management (IAM) to grant users in your account access to Lambda.
    • You can also grant users in another account permission to assume a role in your account and access your Lambda resources.
  • Permissions boundaries
    • The permissions boundary limits the scope of the execution role that the application's template creates for each of its functions, and of any roles that you add to the template.
    • You can add permissions to a function's execution role in the template, but that permission is only effective if it's also allowed by the permissions boundary. 
    • To access other resources or API actions, you or an administrator must expand the permissions boundary to include those resources.
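
As a sketch of a resource-based policy, the call below grants the Amazon S3 service principal permission to invoke a function for event notifications from one bucket; the function name, bucket, and account ID are placeholders.

```python
import boto3

lambda_client = boto3.client("lambda")

# Adds a statement to the function's resource-based policy so that S3,
# acting on behalf of the specified bucket and account, may invoke it.
lambda_client.add_permission(
    FunctionName="my-function",
    StatementId="allow-s3-invoke",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn="arn:aws:s3:::my-example-bucket",
    SourceAccount="123456789012",
)
```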

Managing AWS Lambda functions

  • environment variables
    • To keep secrets out of your function code, store them in the function's configuration and read them from the execution environment during initialization (see the sketch after this list).
    • Use environment variables to make your function code portable by removing connection strings, passwords, and endpoints for external resources.
    • Lambda stores environment variables securely by encrypting them at rest.
  • Versions and aliases are secondary resources that you can create to manage function deployment and invocation. 
  • Use layers to manage your function's dependencies independently and keep your deployment package small. 
  • To establish a private connection between your VPC and Lambda, create an interface VPC endpoint. Interface endpoints are powered by AWS PrivateLink, which enables you to privately access Lambda APIs without an internet gateway, NAT device, VPN connection, or AWS Direct Connect connection. Traffic between your VPC and Lambda does not leave the AWS network.
  • You can configure a function to mount an Amazon Elastic File System (Amazon EFS) file system to a local directory. With Amazon EFS, your function code can access and modify shared resources safely and at high concurrency.
  • Code signing for AWS Lambda helps to ensure that only trusted code runs in your Lambda functions. 
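
A small sketch of the environment-variable pattern described above; the function name, variable names, and values are placeholders. The first snippet sets the variables on the function's configuration:

```python
import boto3

# Store connection settings in the function configuration rather than in code.
boto3.client("lambda").update_function_configuration(
    FunctionName="my-function",
    Environment={
        "Variables": {"DB_HOST": "db.internal.example.com", "DB_PORT": "5432"}
    },
)
```

The function then reads them from the execution environment during initialization:

```python
import os

# Resolved once per execution environment, outside the handler.
DB_HOST = os.environ["DB_HOST"]
DB_PORT = int(os.environ.get("DB_PORT", "5432"))


def lambda_handler(event, context):
    return {"host": DB_HOST, "port": DB_PORT}
```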

Invoking AWS Lambda functions

  • Synchronous invocation
    • When you invoke a function synchronously, Lambda runs the function and waits for a response.
    • When you invoke a function directly, you can check the response for errors and retry. The AWS CLI and AWS SDK also automatically retry on client timeouts, throttling, and service errors.
  • Asynchronous invocation
    • When you invoke a function asynchronously, you don't wait for a response from the function code. You hand off the event to Lambda and Lambda handles the rest. 
    • For asynchronous invocation, Lambda places the event in a queue and returns a success response without additional information. 
    • Lambda manages the function's asynchronous event queue and attempts to retry on errors
    • You can also configure Lambda to send an invocation record to another service. Lambda supports the following destinations for asynchronous invocation.
      • Amazon SQS – A standard SQS queue.
      • Amazon SNS – An SNS topic.
      • AWS Lambda – A Lambda function.
      • Amazon EventBridge – An EventBridge event bus. 
    • The invocation record contains details about the request and response in JSON format. You can configure separate destinations for events that are processed successfully, and events that fail all processing attempts. Alternatively, you can configure an SQS queue or SNS topic as a dead-letter queue for discarded events. For dead-letter queues, Lambda only sends the content of the event, without details about the response.
  • Event source mappings – Lambda reads items from a stream or queue and invokes your function synchronously with batches of records.
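
A sketch of asynchronous invocation with destinations, assuming the function and the SQS queues already exist; all names and ARNs are placeholders.

```python
import json

import boto3

lambda_client = boto3.client("lambda")

# Destinations for asynchronous invocations: successful events go to one
# queue, events that fail all processing attempts go to another.
lambda_client.put_function_event_invoke_config(
    FunctionName="my-function",
    MaximumRetryAttempts=2,
    DestinationConfig={
        "OnSuccess": {"Destination": "arn:aws:sqs:us-east-1:123456789012:success-queue"},
        "OnFailure": {"Destination": "arn:aws:sqs:us-east-1:123456789012:failure-queue"},
    },
)

# Asynchronous invocation: Lambda queues the event and returns immediately.
response = lambda_client.invoke(
    FunctionName="my-function",
    InvocationType="Event",
    Payload=json.dumps({"orderId": "12345"}),
)
print(response["StatusCode"])  # 202 means the event was accepted for processing
```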

AWS Lambda applications

  • An AWS Lambda application is a combination of Lambda functions, event sources, and other resources that work together to perform tasks. 
  • The AWS Serverless Application Repository provides a collection of Lambda applications that you can deploy in your account with a few clicks. 
  • With continuous delivery, every change that you push to your source control repository triggers a pipeline that builds and deploys your application automatically. 

Execution environment

  • Lambda invokes your function in an execution environment, which provides a secure and isolated runtime environment.
  • The execution environment manages the resources required to run your function. 
  • The function's runtime communicates with Lambda using the Runtime API. Extensions communicate with Lambda using the Extensions API.

[Figure: architecture diagram of the execution environment]

  • The lifecycle of the execution environment includes the following phases:
  • Init: In this phase, Lambda creates or unfreezes an execution environment with the configured resources, downloads the code for the function and all layers, initializes any extensions, initializes the runtime, and then runs the function's initialization code (the code outside the main handler).
    • Start all extensions (Extension init)
    • Bootstrap the runtime (Runtime init)
    • Run the function's static code (Function init)
  • Invoke: In this phase, Lambda invokes the function handler. After the function runs to completion, Lambda prepares to handle another function invocation.
  • Shutdown: This phase is triggered if the Lambda function does not receive any invocations for a period of time.
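
A sketch of how the lifecycle shows up in function code: module-level statements run once during the Init phase and are reused while the execution environment stays warm, while the handler body runs on every Invoke. The S3 call is only illustrative and assumes the execution role allows it.

```python
import boto3

# Init phase: runs once per execution environment, so expensive setup such
# as SDK clients or configuration loading is shared across invocations.
s3_client = boto3.client("s3")


def lambda_handler(event, context):
    # Invoke phase: runs for every invocation of this execution environment.
    buckets = s3_client.list_buckets()["Buckets"]
    return {"bucketCount": len(buckets)}
```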

Monitoring

  • Lambda automatically monitors Lambda functions on your behalf and reports metrics through Amazon CloudWatch.
  • Lambda automatically integrates with CloudWatch Logs and pushes all logs from your code to a CloudWatch Logs group associated with a Lambda function, which is named /aws/lambda/<function name>.
  • AWS Trusted Advisor inspects your AWS environment and makes recommendations on ways you can save money, improve system availability and performance, and help close security gaps.
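
A sketch of how function output reaches CloudWatch Logs: anything written through the standard logging module (or print) is captured and delivered to the function's /aws/lambda/<function name> log group.

```python
import logging

# Lambda captures this logger's output and ships it to CloudWatch Logs.
logger = logging.getLogger()
logger.setLevel(logging.INFO)


def lambda_handler(event, context):
    logger.info("Received event with %d top-level keys", len(event))
    return {"ok": True}
```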

Security

  • Encryption in transit: Lambda API endpoints only support secure connections over HTTPS.

  • Encryption at rest: You can use environment variables to store secrets securely for use with Lambda functions. Lambda always encrypts environment variables at rest.

AWS Serverless Application Model

  • The AWS Serverless Application Model (AWS SAM) is an open-source framework that you can use to build serverless applications on AWS.
  • A serverless application is a combination of Lambda functions, event sources, and other resources that work together to perform tasks.
  • You use the AWS SAM specification to define your serverless application.
  • AWS SAM templates are an extension of AWS CloudFormation templates, with some additional components that make them easier to work with.
  • A serverless application can include one or more nested applications. You can deploy a nested application as a stand-alone artifact or as a component of a larger application.
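
A minimal sketch of what the AWS SAM specification adds on top of a CloudFormation template: the Transform declaration and serverless resource types such as AWS::Serverless::Function. The script below only writes out a hypothetical template.yaml; the function name, handler, runtime, and API path are placeholders.

```python
from pathlib import Path
from textwrap import dedent

# A minimal, hypothetical SAM template. The Transform line tells
# CloudFormation to expand AWS::Serverless::* resources, and the
# AWS::Serverless::Function resource bundles a Lambda function with its
# code location, runtime, and an API event source.
TEMPLATE = dedent("""\
    AWSTemplateFormatVersion: '2010-09-09'
    Transform: AWS::Serverless-2016-10-31
    Resources:
      HelloFunction:
        Type: AWS::Serverless::Function
        Properties:
          Handler: app.lambda_handler
          Runtime: python3.12
          CodeUri: src/
          Events:
            HelloApi:
              Type: Api
              Properties:
                Path: /hello
                Method: get
""")

Path("template.yaml").write_text(TEMPLATE)
```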

AWS Serverless Application Repository

  • The AWS Serverless Application Repository makes it easy for developers and enterprises to quickly find, deploy, and publish serverless applications in the AWS Cloud.
  •  With the AWS Serverless Application Repository, you can:

    • Publish applications – Configure and upload applications to make them available to other developers, and publish new versions of applications.

    • Deploy applications – Browse for applications and view information about them, including source code and readme files. Also install, configure, and deploy applications of your choosing.

Reference

https://docs.aws.amazon.com/lambda/latest/dg/welcome.html

https://docs.aws.amazon.com/serverlessrepo/latest/devguide/what-is-serverlessrepo.html

https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/what-is-sam.html
