Visualize and design your architecture with AWS Application Composer

Hello, all. Welcome to SVS 213, Visualize and Design Your Architecture Using AWS Application Composer. As a solutions architect, my day job is to work with customers like you. And one key question that always comes up in any customer conversation, even when I was talking to customers here, is this: you all have an amazing understanding of your requirements and your end customers' requirements. And the question you ask is, "I have this idea for a web application, a mobile application, or, you know, a solution. How do I take that idea, quickly build a solution, and deliver it to the end customer's doorstep as quickly and as efficiently as possible?" Thus delivering customer value and customer satisfaction, and hence business growth.

And in this talk today, I'm going to show you how you can take a concept like that to a customer as quickly as possible.

My name is Nin Gaa. You can call me Nain. Let's get started.

AWS Application Composer is a tool with which you can visually compose, configure, and connect serverless resources. And by doing so, it gives you infrastructure as code that is deployment-ready and follows AWS best practices.

We will now take an idea from a customer requirement, for example a document summary app. Over the course of the next 15 minutes, let's adopt this as our requirement: you have tons of documents, each maybe several hundred lines or, you know, hundreds of pages long. Let's build an application that leverages Amazon Bedrock and an AI model to summarize each document and save the summary in a persistent database, so that we can also access it from either a mobile application or a web application using an API call.

So let's switch to our demo, see things in action, and take this idea into reality. OK?

This is my AWS console. I'm going to access Application Composer. OK? You can either create a demo project or create a new project. So I'm going to create a new project. OK?

This is the Application Composer console page where on the left side, or the right side from where you're sitting, I think, you can see resources. There are two kinds of resource cards: one is the enhanced component, and the other is the standard IaC resource. An enhanced component can be a combination of one or more CloudFormation resources, and that's why it's called a component, whereas a standard resource is a CloudFormation resource itself.

And on the right side here we have a visual canvas where we can drag and drop these resource cards and connect them together. And while doing that, it's going to give us a template which we can save to our local file system if needed, check into our CI/CD or GitHub repositories, and let the pipeline do the rest for us, right?

So what I want is for whatever I'm doing in the Application Composer console here to be saved to our local file system, so that I can check it into my version control system or deploy it, right?

So I'm going to go to the menu, and I'm going to activate local sync. What it is asking for is session-based, limited permission for this browser window to access my folder so that it can save the files in there. OK? I'm going to create a new folder called document-summary, select it, and give it the required permissions. And it's also asking, "are you OK for me to save files?" So I'm going to let it save files there. Perfect. Now auto save is enabled. So whatever we do, it's going to save it to my local file system as well. OK?

If we go back to the design we referred to, you see there can be two flows. As you can imagine, serverless architectures can be either asynchronous, wherein you submit a job and let it do the rest in the background, or synchronous. For our API call, where you are accessing through a mobile or a web application and you want the results instantaneously, that is the synchronous flow.

Let's create both of those flows in our application. For our asynchronous flow, let's start with an S3 (Simple Storage Service) bucket. So I'm going to drag and drop this S3 bucket onto my canvas. If I click on details, we can name it document-summary-s3. And if you notice, by default it is blocking public access to this S3 bucket. Also, if I go to the template, we see that it is following best practices by encrypting the data at rest, and if you scroll down, it has also implemented a policy denying any traffic in transit that is not encrypted. So it's following the best practices there. OK?
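As a sketch of what that generated section of the template looks like (the logical IDs here just follow the name we typed, and the exact output Application Composer produces may differ slightly):

```yaml
Resources:
  DocumentSummaryS3:
    Type: AWS::S3::Bucket
    Properties:
      # Best practice: encrypt data at rest by default
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: aws:kms
              KMSMasterKeyID: alias/aws/s3
      # Best practice: block all public access
      PublicAccessBlockConfiguration:
        BlockPublicAcls: true
        BlockPublicPolicy: true
        IgnorePublicAcls: true
        RestrictPublicBuckets: true

  DocumentSummaryS3BucketPolicy:
    Type: AWS::S3::BucketPolicy
    Properties:
      Bucket: !Ref DocumentSummaryS3
      PolicyDocument:
        Statement:
          # Deny any request that is not made over TLS
          - Effect: Deny
            Principal: '*'
            Action: 's3:*'
            Resource:
              - !GetAtt DocumentSummaryS3.Arn
              - !Sub '${DocumentSummaryS3.Arn}/*'
            Condition:
              Bool:
                aws:SecureTransport: 'false'
```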

Once a user uploads an object, our documents, to the S3 bucket, we want them to be analyzed by some compute, for which let's leverage AWS Lambda, OK? I'm going to drag and drop the Lambda function. Let's call it process-files-lambda. OK?

Now you see that each resource card has properties. It may not be a fully exhaustive list of all the supported properties, but it gives you defaults for the minimum properties so that you can get started. And if you want additional properties, you can always go to the template and change them. OK?

For my package type, I'm going to leave it as Zip, and let me set the source path to process-lambda. For my runtime, let me select Python. OK? You can set the other properties, like the architecture; you can change the memory and timeout for the Lambda function, enable provisioned or reserved concurrency, et cetera. Let's click save on this one. OK.
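For reference, a Lambda card with those properties serializes to a SAM resource along these lines (a sketch; the handler name and exact runtime version are assumptions, and the memory and timeout values are just the console defaults from the demo):

```yaml
  ProcessFilesLambda:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: process-lambda      # the "source path" set on the card
      Handler: handler.handler     # assumed handler name
      Runtime: python3.11          # the Python runtime we selected (version assumed)
      MemorySize: 3008
      Timeout: 30
      Architectures:
        - x86_64
```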

Now, you may wonder why each resource card has two dots, or ports as we call them: one on the left side and one on the right side, right? The one on the left indicates what event can trigger or access this resource, and the one on the right indicates what this particular resource can trigger as the next action in a workflow.

Now let's create a connection between our S3 bucket and the Lambda function. As soon as I click here, you see that, with a little guesswork, it is telling you that this is a supported method of integration. So as soon as I connect the two, what it has done is add code saying that for every object creation or removal event, it's going to trigger that Lambda function, right? Which we are OK with, because we want this Lambda function to be executed when a file or a document is uploaded to our S3 bucket. OK?
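In the template, that connection shows up as an Events section on the Lambda function, roughly like this (logical IDs are a sketch following the names we typed):

```yaml
  ProcessFilesLambda:
    Type: AWS::Serverless::Function
    Properties:
      # ...function properties as before...
      Events:
        DocumentSummaryS3:
          Type: S3
          Properties:
            Bucket: !Ref DocumentSummaryS3
            # Fire the function on both object creation and removal
            Events:
              - s3:ObjectCreated:*
              - s3:ObjectRemoved:*
```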

For our persistent database, let's leverage Amazon DynamoDB. Let me drag and drop the DynamoDB table onto my canvas. Let's call it summary-db. You can set the partition key, whose default I'm OK with, and you can also set a sort key or expiration, et cetera. Let's click save.

Now we can again connect both the ports. And again, what it has done is automatically add environment variables to the Lambda function for both the table name and the ARN of that resource, so that we can refer to them in the business logic that goes inside the Lambda function. It has also created a DynamoDB create, read, update and delete policy, so that when the Lambda function executes, it has permission to access the DynamoDB table which we have created. Perfect.
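What that connection adds to the template is roughly the following (a sketch; the environment variable names and logical IDs follow the demo's naming and may differ in the real generated output):

```yaml
  SummaryDb:
    Type: AWS::DynamoDB::Table
    Properties:
      BillingMode: PAY_PER_REQUEST
      AttributeDefinitions:
        - AttributeName: id        # the default partition key
          AttributeType: S
      KeySchema:
        - AttributeName: id
          KeyType: HASH

  ProcessFilesLambda:
    Type: AWS::Serverless::Function
    Properties:
      # ...
      Environment:
        Variables:
          # Injected so the business logic can find the table
          SUMMARYDB_TABLE_NAME: !Ref SummaryDb
          SUMMARYDB_TABLE_ARN: !GetAtt SummaryDb.Arn
      Policies:
        # SAM policy template granting create/read/update/delete on the table
        - DynamoDBCrudPolicy:
            TableName: !Ref SummaryDb
```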

And for our synchronous flow, let's leverage Amazon API Gateway. I'm going to drag and drop an API here. Let's call it summary-api. Again, you can set various authorizers, or routes like get-summary or get-summary-by-id, et cetera, and if you want to refer to an external API documentation file, you can do so. OK, let's save that.

Similarly, let's again get a Lambda function to process our API requests. I'm going to call it get-summary-api. Now, if you notice here, because we selected our runtime for the previous Lambda, it automatically remembers it and applies the same thing to the new Lambda function we just got. OK.

Let's connect all these resources. Perfect. So we have both the synchronous and asynchronous flows created. But one thing we haven't done so far: our process Lambda function needs access to Bedrock as well. So let's go ahead and create permissions for that.
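To recap the synchronous side, the API and its Lambda integration come out looking roughly like this in the template (the route path, handler name, and stage are assumptions for illustration, not the literal generated output):

```yaml
  SummaryApi:
    Type: AWS::Serverless::Api
    Properties:
      StageName: Prod

  GetSummaryApi:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: get-summary-api
      Handler: handler.handler     # assumed handler name
      Runtime: python3.11          # inherited from the previous function
      Events:
        SummaryApiGET:
          Type: Api
          Properties:
            Path: /summary         # assumed route
            Method: GET
            RestApiId: !Ref SummaryApi
```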

For that, I'm going to use a standard IaC resource to create an additional policy and assign it to our process Lambda function. So I'm going to drag and drop the managed policy here. Unlike our enhanced components, for a standard IaC resource we have to give it the configuration ourselves, in JSON format, right?

So let's call this bedrock-access, and let's paste in the resource configuration, which I have prepared here. Perfect. And let's also reference it in our Lambda function so that it knows it is assigned. OK? Perfect. So our design is done.
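For reference, the managed policy I pasted in is along these lines (a minimal sketch; in practice you would want to scope the Resource down to specific model ARNs rather than use a wildcard):

```yaml
  BedrockAccess:
    Type: AWS::IAM::ManagedPolicy
    Properties:
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Action:
              - bedrock:InvokeModel
            Resource: '*'

  ProcessFilesLambda:
    Type: AWS::Serverless::Function
    Properties:
      # ...
      Policies:
        # Attach the managed policy (Ref on a managed policy returns its ARN)
        - !Ref BedrockAccess
```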

And so is our infrastructure-as-code template, which is already created and synced to our local file system. Let's go ahead and deploy this architecture. OK?

So this is my IDE. Let me open up the new folder which we created. We called it, uh, yeah, document-summary. OK. As you can see, we have the template, our Lambda functions, et cetera, already there in our file system. OK.

To deploy this, I'm going to use AWS SAM, the Serverless Application Model, which is a combination of IaC and a shorthand notation for defining serverless resources, to deploy this template which we have. So I'm using the SAM Accelerate command, sam sync. I'm giving it a stack name, a CloudFormation stack name; we're calling it summary-app-demo. Let's select a region. Perfect.
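The command itself is short. Assuming you run it from the synced project folder, it looks like this (the stack name is from the demo; the region is an assumption, since we're in Oregon later in the demo):

```shell
# SAM Accelerate: validate, build, package and deploy in one step
sam sync --stack-name summary-app-demo --region us-west-2
```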

Let's go ahead and see what it does. As soon as I hit enter, it's validating the template for any syntax issues, it's compiling our Lambda functions, and it's also creating those artifacts and deploying them via CloudFormation stacks.

While it is deploying, let me mention that I have already pre-created a stack with the exact same configuration. Let's use that to test our architecture. OK.

So we have this app, and alongside the infrastructure-as-code template here, what I've done is leverage Amazon CodeWhisperer to create, or rather, let's call it generate, the business logic for us, so that we can test the end-to-end flow.

For example, all my get-summary Lambda is doing is fetching all the objects, the summaries of the documents, from the DynamoDB table and giving them to the API as a response. And our process Lambda is basically leveraging a Bedrock AI model, summarizing each document in under 100 words, and saving the summary in our DynamoDB table. Perfect.
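As a hedged sketch of what that generated business logic might look like (the handler names, the environment variable, and the Bedrock model ID and request shape are all assumptions for illustration, not the literal CodeWhisperer output):

```python
import json
import os


def build_prompt(document_text: str, max_words: int = 100) -> str:
    """Build the summarization prompt sent to the Bedrock model."""
    return (
        f"Summarize the following document in under {max_words} words:\n\n"
        f"{document_text}"
    )


def process_handler(event, context):
    """S3-triggered: summarize each uploaded object and store it in DynamoDB."""
    import boto3  # bundled in the Lambda runtime

    s3 = boto3.client("s3")
    bedrock = boto3.client("bedrock-runtime")
    table = boto3.resource("dynamodb").Table(os.environ["SUMMARYDB_TABLE_NAME"])
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        text = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        response = bedrock.invoke_model(
            modelId="anthropic.claude-v2",  # assumed model ID and request shape
            body=json.dumps(
                {"prompt": build_prompt(text), "max_tokens_to_sample": 300}
            ),
        )
        summary = json.loads(response["body"].read())["completion"]
        table.put_item(Item={"id": key, "summary": summary})


def get_summary_handler(event, context):
    """API Gateway GET: return every stored summary as the response body."""
    import boto3  # bundled in the Lambda runtime

    table = boto3.resource("dynamodb").Table(os.environ["SUMMARYDB_TABLE_NAME"])
    items = table.scan()["Items"]
    return {"statusCode": 200, "body": json.dumps(items)}
```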

So I have some sample files about AWS services, for example EC2, where one has maybe 600 to 700 lines of text. Let's go ahead and upload these to the S3 bucket and see what exactly it produces as a summary, right?

So I'm going to use an AWS CLI command to sync, or upload, these documents. OK. So the document upload is complete.
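For the record, the upload is a one-liner; the local folder and bucket name here are placeholders, since Composer-generated bucket names get a unique suffix at deploy time:

```shell
# Upload the local sample documents into the S3 bucket
aws s3 sync ./sample-docs s3://<your-document-summary-bucket>/ --region us-west-2
```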

Let's go to our AWS console and go to DynamoDB, right? We're using the Oregon region. Under tables, this is our summary app table. OK. Now, if I explore the table items, you see that processing has already completed for all the documents which we uploaded. Obviously, the time is stored as UTC, which is in sync with my laptop's time zone. Let's check one of the results.

Let's see what it has done to our EC2 document, say. OK. So it's saving a timestamp, it is referring to the object or document name, and it's also giving the summary which we are after. It basically says that EC2 offers a wide variety of instance types optimized for various purposes, et cetera. And now for our API, for our synchronous flow.

So we can test it right from here. Visually it's not appealing, but let me trigger the API and look at the response. Yeah. Now you see that with the API GET response, it is able to fetch all the results from our back-end database for all the documents which we have uploaded. OK.

So quickly, let me show you one more thing. By no means is this the full requirement for a business application, right? You can simply keep extending this architecture, keep adding more and more features. For example, you can leverage AWS Step Functions: you can replace the process Lambda function with a Step Functions state machine, where you create a workflow within the same Application Composer console. Maybe when a user uploads a document through the API you want additional processing steps; you can create that workflow and state machine within the same Application Composer console.

And then you have both your workflow and your infrastructure as code in one single console. OK?

You may ask me a question: OK, what about an architecture that is already created in my environment? For that, I've picked up an application from the AWS Samples page. Let me quickly show you.

So I'm opening the project folder for that. It is a CQRS (command query responsibility segregation) pattern, right? So let me open the template. OK.

Now you see that even if you have an existing architecture, it visually shows you what the template is all about and how one resource is connected to another. And if anybody on the team asks you, "show me the latest architecture diagram for the code which is deployed in your environment,"

simply come here, export the canvas, and share this image. That's your design document, right there within the console. OK?

If we switch back to the presentation mode, I want to take you through some of the resources. Let's go back to the presentation mode. Thank you.

OK, let's move ahead. These are some useful resources for whatever I have presented. If you want to get started on this serverless journey, or Application Composer journey, you can leverage all the resources from this page. And for your serverless learning path, you can access the URL s12d.com serverless learning, where you can find all the resources, and you can visit us.

I'm here, obviously, if you have any more questions on this session; you can find me off stage. And with that, I want to thank you for your time.

Please do take a moment to complete the session survey in the AWS Events app. That's of immense value to us, so that we can bring more content like this to you. Thank you so much. Have a nice day.
