Boost developer productivity with Amazon CodeWhisperer

Hi, welcome to DOP 211, Boosting Developer Productivity with Amazon CodeWhisperer. I'm Joe Kobe, Principal Go-To-Market Specialist with NextGen Developer Experience.

NextGen Developer Experience at AWS covers the services that interact with software developers. With me, I have Jessica Feng, another of our Principal Code AI Specialists, and Ron Kimchi, the General Manager of Cloud and AI.

So why are we here today? It's been an interesting journey understanding what productivity is, what it means to be a productive developer, and how we and our customers are thinking about that. So today we're going to talk about this idea of productivity, because I'd be willing to bet that if I asked five of you in the audience what you think a productive developer is, we'd probably get five very different answers.

Secondly, once we've explored this idea of productivity, we're going to talk about how AI plays a role in it: what role AI is playing in either increasing productivity or creating second-order effects within it. Once we've had that conversation, I'll hand over to Ron, and he's going to talk about his team's journey into using generative AI as part of their development process.

And finally, we'll hand over to Jessica, and she's going to talk about lessons learned in this process: what have we seen, and what have our customers learned about getting started with these generative AI tools in the developer productivity space?

So why listen to us? We have the good fortune, in our position, of spending pretty much our entire day talking to customers about software development and the software development life cycle: how they think about it, what tooling they use, what challenges they see. Combine that with the fact that, on our team, we have a history in software development ourselves.

One of my first jobs was actually writing ASP.NET code. Any ASP.NET folks out here from the mid-nineties? Yeah, thank you. From that experience, and then more recently as the Chief Technology Officer for the State of Indiana, I had to manage software development and understand how to balance technical debt with net-new feature development on a finite budget.

These themes, I think, resonate pretty well with our customers. That first job, ASP.NET development, was in the early days of the web, when no one really knew what was possible. Once I had built the site, I was able to rewrite it three times in a year, and I felt super productive. The site was faster, it performed better. But was that really productive? I don't know; we can have a conversation about that.

So what we'd like to do here is take this opportunity to bring forward these conversations we've had with our customers, along with some of our own observations, and really talk about this idea of productivity.

So, what developer productivity is not. I think it's important to talk about this up front. Developer productivity is not a single measure, and it is certainly not simply a measure of velocity, of how quickly a thing happened. That is absolutely not a good way to think about developer productivity.

It is also not a point in time. Looking at what happened today tells you little about what happens six months from now: technologies change, products evolve, teams shift, priorities change. So thinking about this over time really, really matters.

And finally, there is no silver-bullet metric. There's no way to say, hey, if we can hit this metric at this level, we're good, we're productive. It doesn't work that way. I wish it did, but it doesn't.

So having described what it isn't, let's talk about what it is. Understanding developer productivity is a journey, and it will take time. It is not something you can turn on in two days and all of a sudden understand the productivity of your team. It's absolutely a journey. And productivity happens at the individual, the team, and the system level.

How many of you have heard us talk about two-pizza teams here at Amazon and AWS? Probably a lot of folks. So when I'm talking about a team here, what I'm thinking of is the Amazon context of a two-pizza team, where we build and run what we build. In this DevOps world, we have small teams that own and run their thing. So: individual, team, and system productivity. There's no one place that tells the overall story.

You can have a super productive individual developer on a poorly performing team with a poor product, and that doesn't tell a complete story. The objective measures are what got done. Typically, when we think about productivity, certainly in the conversations we've had, there's been a lot of focus on how much of what got done. And the short answer is, that is only part of the story.

This idea of subjective measures, how people felt about what got done, has certainly emerged as something very important and not to be underestimated in this world of generative AI.

So what are we hearing from customers? What are we hearing from you? Well, firstly, no one has this all figured out, because it is a journey. There are certainly folks who are further along than others, but nobody has it figured out. Developer productivity, like software development itself, is a dynamic process. It changes over time; it is constantly evolving.

Culture is very, very important. As with any organizational change, whether you're adopting DevOps or DevSecOps or other organizational shifts, bringing an organization along with this idea of how to think about developer productivity, something that's not just a simple measure, is going to take time. The culture has to be open and willing to change, shift, and evolve.

It is neither a top-down nor a bottom-up effort. From the top down there may be a mandate, hey, we need to get more efficient, but that alone won't reach individuals and teams with the very specific things each team needs to do to get that outcome. And from a bottom-up standpoint, yes, as developers we may really love the latest tools, and that's really cool, but that doesn't necessarily flow up in a way that actually supports team and systemic productivity.

So generally what we see is that, from a top-down perspective, the executives need to be supportive of the fact that this is a journey. Yes, we'd love to get to a place where we understand our productivity, we can increase it, and we know what levers to pull, but that is going to take time. From the bottom-up perspective, we need to make sure our developers are engaged, empowered, trained, and supported through the process of adopting new tools and new ways of working. Because generative AI is a new way of working: learning how to do prompt engineering and all of these things is a new skill, and it takes a little bit of time.

There is absolutely more to productivity than code written. The most productive outcome may, in fact, be to not write any code; we'll dive into that in a moment. But this idea that productivity is driven by lines written is, again, not ideal. Next: homogeneous versus heterogeneous environments.

The conversation goes something like this. Customer one: what sort of development languages do you use? Well, we're 90% Java, and then we've got a little bit of Python, and we're primarily using IntelliJ or something as our IDE. On the other hand, customer two: help me understand what kind of languages you use. We use all of them. I'm sorry? We have a variety of languages, a whole variety of different things. We've got some legacy code, we've done some mergers and acquisitions, we've got a whole fleet of different languages that we support, and the tooling to go with them.

And so, between these two ends of the spectrum, you can envision that instrumenting and understanding how those environments work, and understanding the different impacts on different projects, looks very different. As part of this productivity story, understanding your internal complexities is really critical.

Finally, this idea of maintenance versus net-new development. If I have a legacy code base that I am responsible for maintaining, the definition of productivity there may be very different than if I'm going flat out building something net new to deliver net-new value to customers. The way we think about productivity in those two worlds is going to be slightly different, and we need to be aware of that.

So having said all of this, having said that productivity is complicated and not a single metric: do we have ways of thinking about it? Are there ways to actually reason about this space? And, uncannily, yes, there are.

So how do we think about it here at Amazon? This is not just AWS; this is a pan-Amazon way of thinking about developer productivity, and it's designed to take a holistic approach.

First, we think about system health: the outcomes of the system. Things like feature adoption, customer satisfaction, reliability, security, and resilience sit in this system health idea.

Second, software delivery health: CI/CD efficacy. How effective are our actual processes for going from an idea to getting code deployed? What are the manual bottlenecks? How often are builds failing? Do we have security issues? All of that falls into software delivery health.

And third, team health: developer well-being. How are our teams feeling about what they were able to get done? Did they have to crunch in order to deliver software for re:Invent (that would never happen), and are they now ready to take a break? These things all sit together, and especially with a two-pizza team, where we're operating what we build, the health of that team is critical to delivering services and capabilities for our customers.

So that's one way. A second way: show of hands, has anyone seen or come across the SPACE framework as it relates to developer productivity? It looks like not many. OK. SPACE is a framework that came out of a research project between GitHub, Microsoft, and the University of Victoria, and it thinks about productivity across five dimensions.

First, satisfaction and well-being. Satisfaction: how happy am I in my job? Well-being: how does that make me feel in my life? The data for this is typically gathered with surveys, and you're looking for qualitative feedback. How do I feel about what got done? Am I happy in my job? Am I feeling burned out? These are very important to overall productivity.

Second, performance, which really aligns with our idea of system health. Here we're focused on outcomes: reliability, security, customer adoption, feature usage, those types of performance metrics, the actual outcome of the work we did.

Third, activity. Here we're looking at actions and outputs: things like pull requests and work items, deploys, deployment successes and failures, the success of the pipeline, and also outages, impacts, and severity.

Fourth, communication and collaboration. I'm sure we've all heard many times that for teams to be effective, communication and collaboration are key. That's as true in software development as anywhere else. In software development, this generally translates to transparency: transparency for your team, transparency for others. The way this manifests is that if I'm communicating and collaborating, we may find we don't actually need to write code today; we can avoid rework and duplication through collaboration. And that's what's interesting between activity and communication and collaboration: if we're communicating effectively, we may actually avoid writing code, we may avoid doing work, and that is still incredibly productive.

So within communication and collaboration, the way we would measure it is: how easy is it to surface information about a tool? What's the quality of the documentation? We also think about the availability and quality of expertise: how good is the team at providing code review and at understanding what the system is currently doing? Also, how easy is it to integrate this particular piece of software, this particular tool, with other things? And finally, within this dimension, there's the idea of onboarding. Something we've heard consistently from customers is that they're looking for gen AI to help ease the process of onboarding new developers into organizations, projects, tools, and capabilities. Can we do that using generative AI? That absolutely falls under communication and collaboration in the SPACE framework.

Finally, efficiency and flow. You're probably familiar with the term flow: it's this idea of getting into a zone where we're just super engaged, super productive, working really hard, and it's actually fun. To stay in flow, we're looking to avoid distractions, avoid context switching, avoid meetings (we all love to avoid meetings), avoid manual work in the pipeline, avoid build failures, those types of things, so we can focus on complicated tasks and get them done without interruption.

So when you think about developer productivity from the Amazon perspective, we think about those three dimensions; SPACE thinks about five. But there's another one: how many folks are familiar with DORA metrics in terms of software development? Some folks. DORA is an output of Google Cloud, a research project that's been going on for a number of years, and what they've done is look at what constitutes productive organizations and what makes that work.

DORA takes a slightly different perspective. They look at a set of eleven technical capabilities combined with four organizational capabilities, and use those to predict performance outcomes. So where in the AWS world we talk about system health, and in SPACE they talk about performance, in DORA we're looking at outcomes: deployment frequency, lead time for change, change failure rate, and time to restore. These performance metrics then impact your overall organizational KPIs: are we getting customers, are we getting adoption, those types of things.

DORA also looks at three of the organizational capabilities as specifically driving developer well-being. There we look at things like deployment: as a developer, how comfortable am I deploying new code? Is it going to work? Is it going to get kicked back? Is it going to fail? We look at rework: how many bugs have to get fixed, how many times is work getting kicked back, how often do we have to repeat things over and over? And unchecked, that leads to burnout.

What I think you'll see across these three frameworks is that the impact on the individual developer is critical to this overall idea of productivity, and it is not simply what they did, but how they felt about doing it.

So with that said, what happened with AI? Just over a year ago, I think we all, on a Friday, said, wow, this is really cool, and then the world shifted. Why is that? We had a confluence of data and compute, and we saw these large language models appear. So what did that do for developers, and how did it manifest in our world?

So where does generative AI play a role in this idea of software development? First, satisfaction. Anecdotally, having talked to customers, we've heard consistent feedback like: it's so much more fun, I enjoy software again, I'm enjoying development again. And there was a really interesting piece in this year's State of DevOps report, part of the DORA research, the 2023 State of DevOps. I'll paraphrase, but it said there's a lot of enthusiasm for generative AI, it may take a while for broad adoption, but there is a correlation with developer well-being: the use of these tools is actually making people feel happier about their jobs. We've got more detail on that in a second.

Then velocity. How is gen AI speeding things up? I said at the beginning that it's not all about speed, but speed can be an after-effect, and it can actually be helpful. Generally, within software development, we gain velocity by delivering more information and appropriate context to developers when they need it, where they need it. Typically, the way I think about this: I'm sitting in my IDE, I start to type some code, and I get a recommendation or a code suggestion when I need it, where I need it, which lets me complete the piece of code, the function, the thing I'm working on, thereby increasing my velocity. But that recommendation has to be relevant, it has to be useful, and it has to be good quality.
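To make that concrete, here is a minimal, hypothetical sketch of that in-IDE flow in Java. The comment and the method signature are what the developer types; the body is the kind of completion an assistant such as CodeWhisperer might propose. The example is illustrative, not actual tool output.

```java
import java.util.List;

public class OrderUtils {

    // The developer writes the comment and signature; the assistant
    // proposes the body as an inline, accept-with-tab suggestion.
    // Return the total of all prices above the given threshold.
    public static double totalAbove(List<Double> prices, double threshold) {
        // suggested completion (illustrative):
        return prices.stream()
                .filter(p -> p > threshold)
                .mapToDouble(Double::doubleValue)
                .sum();
    }
}
```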

We're also seeing the outputs of generative AI tools show up as increases in quality, and I'll give you a simple example. Using generative AI tools like CodeWhisperer, you can very easily generate test cases, and a common measure of software quality is test coverage. So what we're seeing is that software quality is improving, because it's much easier to create test cases.

Then security. I was talking to a CISO this morning, and I've seen this before: when we're talking about software development and security, how do we give developers immediate feedback in their IDE that says, hey, that thing you just did is not ideal, there's a better way to do it? With CodeWhisperer, what we've been able to do is include static code analysis, or code scanning, as part of the tool in the developer's IDE. Think of it as a way for developers to check their work: I've been writing some code, I want to validate my project, I want to validate the logic before I commit and before I create downstream impact. And one of the things we're now able to do with generative AI, launched in the past couple of days, is automated code remediation. As I'm working away at my code, I run a security scan to validate my work, and I may find issues. But now we'll also give you a recommendation for how to fix them. Instead of being told this is a problem, then having to go to Google, figure it out, come back, and fix it, you get the remediated code right there in your IDE. Your context switching goes down, you can actually fix it, you learn from it because it's happening in real time, you submit your code, away you go, and you're not creating a whole bunch of downstream impact.
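As a hedged illustration of that scan-and-remediate loop (the finding and the fix below are generic examples I've written, not actual CodeWhisperer scan output), consider a classic injection finding and the parameterized rewrite a remediation suggestion would steer you toward:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class UserLookup {

    // Before: the kind of issue an in-IDE security scan flags,
    // user input concatenated directly into SQL (injection risk).
    ResultSet findUserUnsafe(Connection conn, String name) throws SQLException {
        return conn.createStatement()
                .executeQuery("SELECT * FROM users WHERE name = '" + name + "'");
    }

    // After: the shape of a suggested remediation, a parameterized
    // query that keeps user input out of the SQL text entirely.
    ResultSet findUserSafe(Connection conn, String name) throws SQLException {
        PreparedStatement ps = conn.prepareStatement(
                "SELECT * FROM users WHERE name = ?");
        ps.setString(1, name);
        return ps.executeQuery();
    }
}
```

The point is the loop: the finding and the fix arrive together in the IDE, so the developer applies the change, and learns from it, before it ever reaches review.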

Next, getting up to speed. As I mentioned, we've heard a lot from customers asking: how can we use generative AI to speed up the process of getting our developers up to speed with our code bases, with what it is that we do, with our standards, our APIs, our frameworks, and so forth? A couple of months ago we launched CodeWhisperer customizations, which we tested internally first. When we launched CodeWhisperer, our internal developers said: this is kind of cool, but it doesn't really help us understand the way we do things here at Amazon. The idea of customizations is that you bring your own curated set of code, typically reasonably high quality, reasonably secure, probably relatively new, code that represents patterns you'd like to see repeated. Bring that in, create a customization, and your developers can connect to it. Now, when they look for code recommendations, they get something that looks like your internal code base. One example: we work with customers who've created abstraction layers over the cloud. Instead of hitting S3 directly, they have an internal storage API which might then hit S3, and they wanted their developers to get recommendations for using their frameworks, not our services directly. That's a perfect use case for customizations: enable your developers to get up to speed with the way you do things.
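Here's a hedged sketch of that abstraction-layer pattern, using the AWS SDK for Java v2. The `InternalStorage` class and its names are hypothetical, invented for illustration: application teams call the wrapper rather than S3 directly, so a customization trained on code like this would steer suggestions toward the wrapper instead of the raw SDK.

```java
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

// Hypothetical internal storage API of the kind described above:
// application code depends on InternalStorage, and only this one
// class knows that the backing store happens to be S3.
public class InternalStorage {

    private final S3Client s3;
    private final String bucket;

    public InternalStorage(S3Client s3, String bucket) {
        this.s3 = s3;
        this.bucket = bucket;
    }

    // Save a payload under a logical key in the team's storage layer.
    public void save(String key, byte[] payload) {
        s3.putObject(PutObjectRequest.builder()
                        .bucket(bucket)
                        .key(key)
                        .build(),
                RequestBody.fromBytes(payload));
    }
}
```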

And finally, technical debt. We've all seen it, I'm sure we all have it, and I'm sure if you sit and think for a minute, you can think of technical debt you'd really love to solve. So how does gen AI help us in this world of technical debt? I think of gen AI as a way to fix technical debt at scale, and I'll give you an example. Yesterday we launched CodeWhisperer transformation, or sorry, Q Code Transformation. The idea is that I can point Q at an existing project; the way we launched it in preview yesterday covers upgrading Java 8 and Java 11 to Java 17. I point Q at the project and effectively say: upgrade. It then goes through a workflow of working out how to upgrade that particular piece of Java 8 or Java 11 code to Java 17, fixing the technical-debt problem of running on old, deprecated runtimes. Once you've selected your project and clicked go, a variety of work happens: creating additional test cases, building a managed development environment, compiling the code, testing it, running it, and iterating, and eventually it produces a work plan. Developers can review it: here's what we think needs to change, is that OK? The developers say yes, the changes are made, and your code is ready to go; it's been upgraded. We also pre-announced the ability to do the same thing for .NET, going from .NET Framework to .NET Core, and ideally other use cases will follow: using generative AI to solve technical debt at scale.
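For flavor, here's the kind of mechanical modernization a Java upgrade involves. This is a generic Java 8 versus Java 17 contrast I've written for illustration, not output from Q Code Transformation, whose actual changes are driven by the work plan it produces for your project.

```java
public class StatusLabels {

    // Before (Java 8 style): a verbose statement switch with a
    // mutable local and a break per case.
    static String labelOld(int status) {
        String label;
        switch (status) {
            case 0:  label = "pending"; break;
            case 1:  label = "active";  break;
            default: label = "unknown"; break;
        }
        return label;
    }

    // After (valid from Java 14, idiomatic on Java 17): a switch
    // expression, one of many small rewrites an upgrade rolls up.
    static String labelNew(int status) {
        return switch (status) {
            case 0  -> "pending";
            case 1  -> "active";
            default -> "unknown";
        };
    }
}
```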

So I've mentioned satisfaction. I've talked a lot about developers and their feelings about what they're doing, and we saw it in the State of DevOps report from DORA. But we're also seeing the same thing in this McKinsey study: I felt happier using gen AI tools; I was able to focus on more satisfying work with gen AI tools.

And: I was in a flow state more. I think we can logically see why. If I'm using generative AI to take away the heavy lifting, the basic things we do every day that we'd prefer not to do, then I can stay in flow and work on the more complicated tasks.

That all makes sense. So: we've talked about how developer productivity is not a single metric, we've said what it is, and we've covered a couple of frameworks and ways to think about it within your organization. Now we've gotten to a place where we have some gains. We've done the work, we're feeling good, we've gained some productivity, and we're able to deliver against our KPIs more effectively.

What do we do with those gains? I think this is an interesting conversation, and depending on what your organization is doing, you may have different answers. Firstly, do we want to take those gains and reinvest in technical debt? Do we want to keep paying down our tech debt so that our fleet becomes more manageable, more operable, more stable?

Do we want to plow those gains into feature development: net-new features, new things, delivering more for our customers and innovating faster? Or perhaps we want to give those gains back to the developers so they can play: do a little R&D, experiment with some things, do some really innovative stuff, and come up with net-new ideas.

I think the answer will probably be different for each of your organizations, but it's worth thinking about on this journey of productivity: when we get to the place of having been more productive, what are we going to do at that point?

So having said all of that, what I'd like to do now is introduce Ron Kimchi, the General Manager of Cloud and AI, to walk through his team's journey in this gen AI space. Thank you.

Hello. So I'm Ron, General Manager of the Cloud and AI team in AWS. I lead the service teams for Elastic Disaster Recovery and Application Migration Service. First of all, what is a service team in AWS? It's basically the engineering team, the software development engineering team, that builds an AWS service. So it's a large engineering team, and we're a product team of software developers. I want to share today how we've started to use CodeWhisperer on our team and how it has increased our productivity.

But before we dive into how it helped us, I want to give you some context about what we do and about our challenge, because from my perspective, when we talk about productivity, it's all about complexity. Complexity generates variance between tasks. If I had a long list of very small, simple tasks, it would be pretty easy, or relatively easy, to measure productivity. But because of the complexity, we have variance between different tasks, and it's also more difficult to estimate or assess in advance which tasks will be more complicated than others. Measuring productivity is a big challenge when you have big variance.

So let's dive into the details so you can better understand what we do and what our challenge is, and then we'll talk about how we have integrated CodeWhisperer into our processes.

Elastic Disaster Recovery and Application Migration Service share some common code; I'll use Elastic Disaster Recovery as the key example of what we do. Basically, EDR, or Elastic DR, is used for on-prem-to-AWS disaster recovery, cross-region DR, or cross-Availability-Zone DR. We have some pretty complex infrastructure, and there are a few pillars.

The first pillar is our agent, which does block-level replication. Now, block-level replication has some inherent complexity to it. Our goal is to have zero or minimal impact on the source servers we replicate across regions, whether for migration or for disaster recovery. Second, we want to minimize data loss for customers if they need to fail over in a disaster recovery event, which means we need to keep very small, very rapid increments of data.

We sit in a very critical path, meaning we have a user-space agent with a kernel driver that intercepts writes and has to replicate them at a very rapid pace in order to keep up with very high, demanding change-rate workloads. That's one core pillar of the technology: we replicate those writes continuously to AWS, and we launch recovery instances when customers want to recover.

Now, if you take an on-prem server, make a perfect replica, and try to run that machine on AWS, most likely it will not run. You need to convert that machine, that operating system, to run natively on AWS. As you all know, there is a huge variance of operating systems and distributions, Linux and Windows, so there's a lot of intensity and depth in that part as well.

Now, that's the product. But the challenge becomes even greater when you need to build for scale at AWS, where so many customers use your services, and the workload and the scale are so huge that the bar is simply extremely high.

We'll start with the security bar. Obviously, the security bar is the highest. And as Joe mentioned, shifting left, finding security issues at an early phase, is just extremely productive. We would find them eventually; we always did, because we have very extensive processes to make sure the security bar is met. But the earlier in the game you find those issues, especially the more common, junior-level ones, the better. Second is quality.

Again, we have always had extensive test coverage and various test suites, but having that with CodeWhisperer just boosted our productivity; we'll talk about that. And obviously, operational excellence: our services need to be available for all of our customers, all the time, as simple as that. That also brings a lot of challenges into the processes.

So let's talk a bit about productivity boosters as we see them in the team. First of all, collaboration and teamwork: we work in teams, and teams work together. I have multiple locations in my team because it's a pretty large team, but we try to consolidate locations as much as we can, so that teams and team members can work together and collaborate as much as possible. That's the biggest productivity booster I know of.

Then we have processes. If we have inefficient processes, productivity will be impaired. Having software development life cycle processes that are effective, agile enough, and responsive to change is also key. But I want to focus on motivation and technology and how they work together.

Motivation, well-being, developer happiness: in my opinion, this is one of the best and most important productivity boosters there is, and technology is a key part of it. So let's see how we've implemented CodeWhisperer in the team.

It wasn't a mandate; as the leader of the team, I did not enforce it on team members. And it wasn't really a bottom-up process either. It's just that, as an ongoing state of mind and an approach to engineering, we give every one of our employees the room and the space to suggest things, to innovate, and to improve our processes. And we saw that it was really rapidly adopted across the team.

First of all, I'm going to talk about some highlights from what we've seen. Our senior SDEs, that is, software development engineers who are very senior, handle all of the very complex tasks, and their time is invaluable to the team. The main takeaway was that they can simply concentrate longer in coding sessions, because CodeWhisperer took on some of the cognitive load of the more basic things they had to do, things that were still time consuming.

Having senior SDEs concentrate longer on complex tasks is invaluable, and it's a force multiplier for the team, because they help less experienced SDEs all the time. They also do a lot of code reviews; that's a key part of the role. They're so experienced at reading code that adopting CodeWhisperer was very effective for them.

We had some other highlights. Unit testing was a big win. The bar in our case stayed the same; we would have written those tests anyway. But it was just so much faster, and the effectiveness was a huge positive surprise for all of the SDEs on the team who used it, the more junior ones and the more experienced ones alike.
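As a hedged example of why generated tests are fast to review, here is a hypothetical method under test with JUnit 5 tests of the shape an assistant typically drafts; none of this is actual CodeWhisperer output.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

class PortParserTest {

    // Hypothetical method under test.
    static int parsePort(String s) {
        int p = Integer.parseInt(s.trim());
        if (p < 1 || p > 65535) {
            throw new IllegalArgumentException("port out of range: " + p);
        }
        return p;
    }

    // The cases an assistant tends to draft from the signature alone:
    // the happy path plus the edges a reviewer would ask about anyway.
    @Test
    void acceptsValidPortWithWhitespace() {
        assertEquals(8080, parsePort(" 8080 "));
    }

    @Test
    void rejectsOutOfRangePort() {
        assertThrows(IllegalArgumentException.class, () -> parsePort("70000"));
    }

    @Test
    void rejectsNonNumericInput() {
        assertThrows(NumberFormatException.class, () -> parsePort("http"));
    }
}
```

The generated cases still go through the same review bar; the win is that the boilerplate arrives pre-written.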

Another point is integrating with other AWS APIs. As you probably know, AWS services integrate with one another; we don't want every team to build its own logging framework, for example, or its own storage, and so on. So we use other AWS services, and the way we do that is through the AWS APIs, because the organization is large and that's the best way to keep backwards compatibility between the various teams and components.

We saw that CodeWhisperer gave a huge boost on projects that required a lot of intensive API work. It prevented, or saved, a lot of the context switches of going off to read a spec and coming back to the IDE; everything was just intertwined in the IDE. We saw that as a huge win for the team, and especially for the SDEs who had less experience with the specific APIs that were new to them.
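A small sketch of what that intensive API work can look like, using the AWS SDK for Java v2. The `ServiceLogger` wrapper is hypothetical, and the sketch assumes the log group and stream already exist; the nested builder chain is the part an assistant can complete from a comment, sparing the round trip to the API docs.

```java
import software.amazon.awssdk.services.cloudwatchlogs.CloudWatchLogsClient;
import software.amazon.awssdk.services.cloudwatchlogs.model.InputLogEvent;
import software.amazon.awssdk.services.cloudwatchlogs.model.PutLogEventsRequest;

import java.time.Instant;
import java.util.List;

public class ServiceLogger {

    private final CloudWatchLogsClient logs = CloudWatchLogsClient.create();

    // Boilerplate-heavy SDK call: exactly the kind of code an
    // assistant fills in without the developer leaving the IDE.
    public void emit(String group, String stream, String message) {
        logs.putLogEvents(PutLogEventsRequest.builder()
                .logGroupName(group)
                .logStreamName(stream)
                .logEvents(List.of(InputLogEvent.builder()
                        .timestamp(Instant.now().toEpochMilli())
                        .message(message)
                        .build()))
                .build());
    }
}
```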

Comments were also a nice win: just auto-generating comments. It's not the most complex or challenging task, but taking away that cognitive effort on an ongoing basis is super important.

And the most important thing for me as a GM was that most SDEs who started using it said it's simply more fun to work with this new, cutting-edge technology. It feels like they're doing new and exciting things, and they simply enjoyed it. As we said before, the well-being, happiness, and motivation of our employees are among our top priorities.

Exactly as we serve our customers, we serve our employees, and in that sense, that was, I think, the biggest win for me.

So, a few quick reflections.

First of all, we're working on adopting new functionality, like the customizations Joe described, and we're looking forward to seeing how that will impact the team. As we said, productivity is not a metric; it's a process, a journey, ever-changing. As managers and team leads, we need to be constantly attentive to changes in the environment and respond to what our team members need.

And these are the very early days. It's so exciting: after so many years of leading engineering teams, I can't imagine what this is going to look like in a year or two, and how it's going to change the way we work day to day.

So it's really fun and exciting, and the space is so wide that there are a lot of low-hanging fruit and risk-free places where you can start to explore. We had a very positive experience on our team. And now I'll hand over to Jessica to let you know what's coming next.

So we've heard a little bit about what productivity is and how we should think about it, and Ron just shared how his team has adopted these tools. What I want to do for the next couple of minutes is look at what's next: how do you think about incorporating tools like CodeWhisperer, and the broader developer portfolio, into your organization?

But before I do that, I'm going to take a quick step back and look at how generative AI is really changing the way we think about the developer experience, beyond productivity in terms of velocity, developer sentiment, and system health. How do we significantly improve the experience for the developer?

Research, and the customer conversations we've had, show that it's about equipping developers to be productive so they have a better experience, and that it's not just about writing code but about the broader SDLC. Some of the places we're looking at include the design phase, when you're just about to get started. A lot of managers, team leads, and developers need to think about the architecture: how do I leverage generative AI tools to take that first step, so I can understand the complexities and the impact on my development teams before we get started?

The second piece is around understanding and learning. As you all know, developers face challenges in figuring out what's the right documentation, where's that code sample, what are the best practices for my organization, and how do I find all of that when I need it, in the task I'm doing right now?

And finally, as we mentioned, outside of the IDE where developers work, many parts of the SDLC are team-based. So how do you think about collaboration and communication within and across your organization?

I think you've hopefully seen how AWS is starting along that journey. We're trying to reimagine how we build, what we build, and who can build on AWS. On the "how" piece, we've talked a lot about CodeWhisperer, and we've also announced Amazon Q, your interactive expert assistant for building, operating, and managing applications on AWS.

We're reinventing what you can build. Hopefully you have all heard about Bedrock for building applications with generative AI, along with foundation models. But we've also launched the Amazon Q feature development capability, which goes from a natural language prompt to an implemented feature in a matter of minutes.

And we're expanding who can build. As someone who hasn't touched code in probably 20 years, generative AI really opens up that opportunity. Hopefully you've heard of PartyRock: I was able to build an application in a couple of minutes without having to understand syntax or a programming language or anything like that.

We're also launching new services and new capabilities across other services, for the front-end developer, with a code-first approach in Amplify. And underlying all of it, we hear from our customers the need for security, privacy, and the responsible use of AI within their organizations.

So what we want to talk about is finding the balance between everything gen AI promises, the spark of innovation, the transformation of industries, and some of the potential risks and challenges that have come up in these customer conversations. In talking to our customers, some common themes emerge.

How do I understand how the model uses my data? How do I understand how the model was trained? How do I weigh the trade-offs between speed and velocity on one side and the quality of the generated content on the other? And what does staying up to date in this emerging, evolving space really mean?

It's that balance you'll have to work out across your teams and your broader organization in order to take advantage of gen AI while ensuring safe, secure, and transparent development with the technology. A lot of that responsible adoption requires thorough testing, establishing processes and principles, and mitigating potential misuse of the technology.

So when we're thinking about ways to get started, we should be thinking about responsible AI. What's acceptable within my organization? How do I think about governance, about who has access and for what use case? How do I think about explainability: when I get this content, this code recommendation or suggestion, how do I attribute it? Is it my code? Is it your code?

That balance is often the starting point for organizations as they think about these tools. The way we like to think about it is: can you use AI-powered coding tools as your first toe in the gen AI waters? We're biased, but Amazon CodeWhisperer is a great, risk-free way, as Ron mentioned, to try that.

First of all, it's where your developers are. It's really easy to get started: developers are often working in the IDE or the CLI, so it's easy to download the AWS Toolkit and start coding right away. You're not disrupting the processes and workflows you've already established within your organization in order to try out this new tool.

Second, I'm sure everyone here has governance, compliance, and security policies, and as Ron mentioned, CodeWhisperer enables you to shift left and introduce security earlier in the life cycle. So not only do we have the security scanning function that we had at launch;

as Joe mentioned, we also recently launched the ability to auto-remediate, with code suggestions that address those security or code quality issues. And the last piece: a lot of our customers' organizations are thinking about the responsible use of AI, and CodeWhisperer gives you an opportunity to start thinking about the processes and guardrails you want to put in place.

We've worked really closely with our legal teams to curate the underlying data that supports the model making these suggestions. But more importantly, we have a reference tracker feature built into the service itself. If your code looks, or smells, like anything out in the open source world, you can look at the reference, see the source code, see what licenses it can be attributed to, and decide whether or not you want to include it in your project.

Those are some of the key ideas: think broadly about what your gen AI strategy is, but root it in a tangible, tactical way with a service like CodeWhisperer.

As Joe mentioned, we have the privilege of talking to a lot of customers. So what I want to end with is: what are the common patterns we've seen, and what practical guidance can you walk away with for getting started with generative AI tools in your organization?

First, establish operational principles and guardrails. Today, teams are often faced with a variety of standards, policies, and tooling that can create friction in the overall development process, and many of our customers are working hard to create a paved path: how do we reduce the time it takes to get code from idea into production?

This includes developing clear guidelines for the appropriate use of generated content, and working with legal teams to address whatever those considerations are. We've also heard from a lot of customers about implementing a human review process before publishing or acting on any of the code generated in these suggestions.

With gen AI, we're moving somewhat away from writing code and toward reading code and understanding what it means for your architectures and for large-scale system design. So how do you incorporate your monitoring systems to detect harmful, biased, or even just low-quality suggestions? And what additional checks or processes do you need in order to understand what a suggestion means for your application?

Next, define success. We've talked a lot about what the metric of productivity might mean, but align within your organization on the key performance indicators across productivity, creativity, accuracy, and the developer experience. This will be a combination of subjective feedback, surveys and focus groups, understanding how your developers feel about their work and what they're working on, as well as objective information.

Joe talked about pull requests, security scans, and successful deployments; it will be a combination of those, and you'll have to work across your organization and your teams to define it. For those earlier in their gen AI journey, this could mean figuring out how to instrument your SDLC to collect these metrics and establishing a baseline of where your starting point is.

That way, when you start adopting gen-AI-powered tools, you know what delta you're actually getting. And for those who have already started down the gen AI journey, this could mean setting thresholds for acceptable rates of error, safety issues, or bias.

As we build and evolve our application development process with these tools, you'll need to continuously evaluate the outputs and outcomes against what success means. We talked about how you reinvest those productivity gains; fold that back into this ongoing journey of evaluating productivity. Next, recruit an executive sponsor and figure out what the change is.

Many of our customers are establishing executive steering committees. These can be the north star for how your organization thinks about leveraging gen AI: they can influence corporate mandates and business strategy, while tying it back to what, and how, your development teams are able to deliver. It's that organizational direction and cultural change that's critical to securing buy-in.

Because, as Joe mentioned, it's both a top-down and a bottom-up initiative, working together to figure out what we're actually going to do. Having that bellwether, a spokesperson to advocate for incorporating gen AI into your organization, often leads to technology decisions, policy updates, and organizational changes.

Many of our customers have started to put together teams that span developers (those who are using and operating these tools and building applications for your organization), a technical lead who can set the direction and influence how the organization thinks about these tools, like Ron's senior SDEs, and a program manager.

How do you coordinate a phased rollout of these tools and ensure there's adequate budget, staffing, resources, and enablement so that you can be successful? There's a degree of cultural change that will be needed to ensure broader adoption of these types of tools. And finally, prioritize education and diversity.

For most of our customers, the end goal isn't just generative AI; it's often along the lines of fast flow and fast feedback: how do I spark creativity and innovation in my organization? Helping teams create that streamlined approach often includes finding the expert builders who can lead with influence, through tech talks or training sessions, and show the excitement of the art of the possible with these tools.

With generative AI, we've talked about the opportunity to redefine what it means to be a developer: how do I expand the community of people who can build and operate applications within my organization? It means evolving our skill sets, abstracting away the undifferentiated heavy lifting, and inviting anyone to be part of this.

So think about recruiting folks across different seniority levels, tenures, and educational backgrounds; provide them with ongoing training, enablement, and education; and foster an inclusive community, so that your influence extends beyond the development teams and more broadly across your organization.

This is just the start of our journey to reimagine the developer experience on, and with, AWS. We're just scratching the surface of what it means to boost productivity with tools like CodeWhisperer and Amazon Q, and we're really dedicated to empowering developers like you to accelerate, create, and shape the future.

Thank you so much for your time, and I look forward to seeing what you all build.
