Innovation talk: Emerging tech

Welcome, everybody. I'm Bill Vass, Vice President of Engineering at AWS. It's an amazing time that we're seeing out there, and as the machine learning models and everything else continue to grow, it's really exciting. So we're going to talk a little bit about emerging tech. We're going to talk a little about our road maps, why we deliver what we deliver, and some of our plans.

You know, back in 1978, I'm going to age myself a little bit, I worked on early neural networks for navigation of autonomous underwater vehicles. And we were building on some of the same foundations that are in the linear algebra based models and things like that that we're working on today, with all the interconnects and so on. But there just wasn't enough data and enough compute at that time to really make it work. So the submarines got lost all the time, and that was very frustrating for my boss.

Can you advance to the next slide? It doesn't seem to be working... there we go. And so with things like the cloud, we now have enough compute and enough storage to actually do it and make these things real. My boss used to call it artificial stupidity because the thing would get lost all the time, and that was always one of the challenges. Now, with the boom of generative AI, you can start doing a lot more in this area. The other thing we're starting to see, with the amount of compute we're getting, is what I will call full fidelity digital twins. That allows you to create complete software defined environments that operate in the cloud the way they do in the real world. And that's pretty exciting. It really accelerates high performance computing and machine learning and allows you to apply it to every part of your business, end to end.

So, you know, back in the nineties when I was working on a lot of the early internet components, we were really excited. I mean, when you'd see a URL in an ad, you'd be excited just because it was a URL, right? We thought things were moving very fast then, but they're moving much faster now; it's accelerating five times faster than growth did in the early internet age. That was me when I was at the Pentagon, not the CIO, one of the CIOs for the Pentagon, and that was what Amazon Web Services looked like back then. And this is what it looks like now, and what I look like now. Unfortunately, I don't look as good as Amazon Web Services does; I haven't done as well over the ages. But it's really pretty amazing to see how quickly things have changed, and it's going five times faster now. I'm really amazed.

And so why is emerging technology so important, and why do we focus on it as much as we do? We kind of make the future in our business here, in a lot of ways. If you're not staying up to date with emerging technology, you're going to get outperformed by your competition. I talk to a lot of leaders who say they don't want to get Ubered. While I was at the Pentagon, we worked on the GPS satellites, and I remember when we were deploying those GPS satellites, we were sitting around talking about, wow, there's going to be a screen in your car and it's going to navigate for you. But we had no concept of something like Lyft and Uber, where you combine a smart car, a smartphone, the cloud, and a GPS satellite system and you disrupt all transportation systems. That's the kind of disruption we're starting to see. And I think with what's coming with Gen AI, where it's going to be embedded in everything you do, that's just going to continue. So you really need to stay on top of technology all the time.

So we'll talk a little bit about our road maps. You know, things like Nitro and Graviton and Trainium don't just happen overnight. Those are planned years in advance. We knew that machine learning would be coming; that's why we focused on Trainium. We knew there would be a need for ARM based processing in the cloud; that's why we focused on Graviton. You saw the launch at Adam's keynote of the high-performance S3. We started planning that in 2015, because we knew that was going to be needed as well. And Mai-Lan, I'm sure, will talk about that when she gets up here.

So the other thing we talk about a lot at Amazon is flywheels, and this is the way we look at the flywheel. Now, we would have liked to deliver all of this at once, fully formed, but there are two reasons we couldn't: one, the technology didn't all catch up in time to do it, and two, it's tremendously complex to do. And with this flywheel, every time it turns, things get better and things improve. So let's talk about the flywheel.

So connect and collect is happening everywhere. Whether you're streaming web clicks from your website, or connecting over IoT, or connecting a phone, whether it's databases or ETL, connect and collect is happening everywhere. We make it possible for you to store and manage real and synthetic data at the exabyte scale, and potentially even the zettabyte scale. Everything's becoming software defined; software is eating hardware, and if you're not software defining everything, you're falling behind. You should think about all the ways you can software define your business and your processes. And once it's software defined, you can build it as a digital twin, run it in the cloud in a test environment, run it in the real world as well, and even run the same binary in the cloud that you run in the real world. So you can run on an ARM processor in the cloud and an ARM processor in a car, for example, or a control system, or something like that.

And once you have it in the cloud, you can test and simulate it. You can do things you just can't do in the real world. You can drive millions of miles an hour in an autonomous vehicle in the cloud, for example; you can't do that in the real world. And with all the advances in AI, you can optimize it. You can have it look at all the possible combinations that could ever occur, combinations you could never conceive of as a human in the normal environment, and leverage things like high-performance computing. And then we make it really easy for you to push it out and operate it. And once you push it out and operate it, with that improvement piece it just starts over again and it gets better and better.

So let's talk about these in detail. First of all, if you're going to do this, especially the connect component, you need a really, really strong foundation, a really strong foundation of security. Security is job one at AWS. It has to be very scalable and it's got to be sustainable; we can't run all of this and use all this power if it's not sustainable. It's got to reduce cost, it's got to accelerate innovation, it's got to increase your performance, and it's got to be a better customer experience, ever improving.

So at AWS, we focus on what I call military grade security. That's why you see us win so many of our government contracts and things like that: because we have the strongest foundation of layered encryption, Level 6 certification, firewalls, all of that kind of stuff, and over 300 cloud products that are focused on security. And we have end to end scalability, as Adam mentioned and re-emphasized, with multiple Availability Zones that are physically separated by many kilometers, not all on floors of the same building.

Our other goal is to provide, from the edge to the cloud, a common set of APIs and deployment and management infrastructure for you. And we're very sustainable: just by moving to the cloud, you're 3.6x more energy efficient and you reduce your carbon footprint by 88% on average. And we're not just doing it in the cloud; we're doing it for our delivery vehicles, we're doing it everywhere. We're on the path to be 100% renewable energy by 2025 and net zero by 2040, so we're very committed here. We're an 8x bigger buyer of renewable energy than any other company.

So let's talk about connect and collect. Oops, can you go back, please? So there's a vast amount of data, trillions of transactions coming into databases every day. Web clicks are being tracked, sales information, logistics, supply chain, purchases from your partners, purchases from your customers, all this data coming in, and data coming in wirelessly through WiFi and 5G, satellites, and more.

So let's do a quick survey. How many unique device connections does AWS manage today for IoT devices? Not the total number, but how many unique device connections? We'll give it a little time to poll here. Wow, 1.2 billion, 270 million, that's interesting. So we have 3.3 billion devices under management, but about 270 million of those connect at any one time, because not every device connects all the time, every day. Pretty amazing, isn't it? When you think about it, it's kind of mind boggling.

So here are some of the things you might not see in our launches that are actually pretty important in this connect environment. At re:Invent we launched a transaction system to make it easier to connect and pull in EDI transactions from your field offices, partners, logistics and sales organizations, and things like that. EDI is still a very common method of bringing in data, and we just made it much easier for you and for enterprises to do that.

We've got our Local Zones. We've got Sidewalk, where you can just turn it on and have devices immediately connect for low bandwidth, low power connectivity; we already cover 93% of the United States. We've got Private 5G, where you can run it on a Snowball or a Snowcone or an Outpost at your site. We have our Wavelength Zones as well, and of course WiFi.

But what about connecting everywhere? You may have seen in Adam's keynote the launch of Kuiper. Kuiper allows you to create a VPC from the edge device all the way through to your EC2 instance. As we get all those satellites out, it's a low Earth orbit system: less than 50 milliseconds of latency, lights-on connectivity, and a gig of bandwidth. Pretty amazing. It allows you to connect all those remote sites and kind of roam between all the different types of wireless connectivity, so it really gives you the ability to be connected all the time, everywhere.

And of course, we recently put a Snowcone on the International Space Station that's doing edge processing in space and collecting data. So you're going to get a chance to try it yourself. Right now, on the Apple App Store, you can download this app; just take a picture of the code up here. It turns your phone into an IoT sensor, and you can shake the phone and watch the data stream back into the cloud. It doesn't save any data in the cloud, but you can bring it up on a desktop next to it. If you have your laptop with you, you can send yourself the URL from the app, bring it up, shake your phone, and watch it change in the cloud.

Now, the cool thing about this is you can see the log data in JSON streaming off your phone in real time, so you can see what we're actually sending and confirm we're not sending any PII or anything like that. And you can go over to GitHub, download the code, cut and paste it, and turn any app you want into an IoT app. So go ahead, take a picture of it, download it, and give it a shot, so you can be connecting and collecting while I continue to talk.
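If you want to see roughly what a publish like that looks like from code, here is a minimal sketch using boto3's IoT data-plane client. It is not the demo app's actual code (that lives on GitHub); the region, topic name, and payload fields are hypothetical placeholders.

```python
# Minimal sketch of streaming sensor-style readings to AWS IoT Core.
# Not the demo app's code; topic, payload fields, and region are placeholders.
import json
import random
import time

import boto3

# The "iot-data" client talks to the AWS IoT Core data plane.
iot = boto3.client("iot-data", region_name="us-east-1")


def publish_shake_sample(device_id: str) -> None:
    # Fake accelerometer values standing in for the phone's real sensor data.
    payload = {
        "deviceId": device_id,
        "timestamp": int(time.time() * 1000),
        "accel": {
            "x": random.uniform(-1, 1),
            "y": random.uniform(-1, 1),
            "z": random.uniform(-1, 1),
        },
    }
    # QoS 1 asks the broker for at-least-once delivery.
    iot.publish(
        topic=f"demo/shake/{device_id}",  # hypothetical topic
        qos=1,
        payload=json.dumps(payload).encode("utf-8"),
    )


if __name__ == "__main__":
    for _ in range(10):
        publish_shake_sample("my-phone")
        time.sleep(0.5)
```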

So you're going to have a lot of real data coming in, and as I mentioned, we've got lots of ways to get it in, whether it's Kinesis or Snowball or transfer and ETL tools, and to store it in things like S3 or FSx, whether it's coming from cars or IoT or any of those places. A lot of real data comes in, you're going to need to save that real data, and we make it really easy to save it cheaply. But you're also going to need to generate synthetic data. You need to train machines on how to deal with, say, a fire in a city or a disruption of your logistics supply chain. You don't want to do that in the real world, record it to see what happens, and then train on it; you need to synthetically generate it. And so we're working on ways now...

I'm talking future here; it's not all here today, but these are all things in process. We're working on ways to generate synthetic images for our vision systems. So for example, on the left side there (I guess it's the right for you), you see synthetic packages that we generate. You would think that at Amazon, with us shipping millions of packages every day, we'd have enough images of packages, and you'll see that with some of our fulfillment center digital twins in a minute. But it isn't enough; we need billions of images to train these machine learning models. In the center there, we generate defects in motors so that machine learning models can learn to detect those defects. And there's Perdue, where we generate synthetic chicken nuggets. I never thought I'd do that for a living as a programmer. But it allows the computer to learn to detect those things; you need millions or billions of pictures. So what you're seeing here is not real. Everything there is synthetic, generated with the Unreal Engine and a physics engine. It's all synthetic, and we feed it in along with the real images to train the models for the optical systems at our fulfillment centers.

Everything here is synthetic too. This is Aurora, one of our great customers. They do 15 million miles a day of driving in synthetic environments; you could just physically never do that. So you need a combination of rich synthetic data and capabilities and rich real world data. And you can see here a great visualization of the images of those packages, coming from the real data and the synthetic data, and how they build out the matrix of parameters of one of these models. That's what you want: a dense parameter set to have an accurate large language model. So you're going to want to keep and manage your real and synthetic data.

And then once you have that real and synthetic data, you can define all your processes and systems as software and feed it into that software in the cloud, whether you're running for real in the cloud or running tests and synthetic environments in the cloud, whether you're doing smart cities or factories or buildings or enterprises. And you can get to the point where you start doing full fidelity digital twins in the cloud. To enable those digital twins, we have TwinMaker. This is thousands of our customers, along with our fulfillment centers, in TwinMaker; these are real images from TwinMaker. You can create a digital twin of a body, a human, a part of a body, an organ, a car, a ship, an airplane, or full factories. It's pretty amazing to see all that.

And you might notice we've got KVS, Kinesis Video Streams, which is also in your mobile app if you've downloaded it. That makes it very easy to connect all the vision systems right into TwinMaker, so you can see what's happening in real time on the factory floor. And you may have seen Jensen on stage with Adam talking about the L40S; a lot of that was about machine learning, but another very important point is Omniverse. Omniverse is NVIDIA's full fidelity digital twin. They're doing 100% ray tracing rather than rasterization, because you have to be able to create a twin of the real world, and this is now getting integrated into TwinMaker as well. So for the first time it brings the OT environment and the simulation environment together in real time, for you to be able to train in your environment, simulate your environments, and optimize your factories.

But there's still one piece missing, and Rainer is going to come up a little bit later and talk about what's missing there: the whole PLC environment and all the factory automation code that you need to simulate as well. And of course, you continue to see great examples of software defined vehicles in the automotive industry; everything's moving to software defined vehicles. So let's take a look at how high performance computing can optimize the design of vehicles.

So for the last six years in a row, AWS has won the best cloud platform for supercomputing at the Supercomputing conference. And we continue to work with great customers like Formula One and Toyota on advanced design using things like ParallelCluster and Batch to design their next generations of vehicles and win races. And they have to do all this in the cloud. You can't do all this in the physical world anymore.

I suppose I have another question. As you start to see the merging of machine learning and high-performance computing, do you think you could do something as complicated as computational fluid dynamics and the drag coefficient for a car using just machine learning? What do you guys think? Let's take the survey; I'm curious what you think. Got the answer yet? Yes. You know, it's interesting: my team presented this to me in one of our reviews and I told them I didn't believe them, but you're actually right, we're getting very close to being able to do that. Let me show you an example of ML optimization where we do just that.

So what I did is I asked Stable Diffusion, one of the image models, to make me an image of a luxury vehicle, and it went out and generated that image. Then we took that image and ran it through a neural radiance field, by generating lots of different views of it. Now, the important thing to point out here is that the fidelity of 3D generation from these models is not there yet; it's evolving, and there must be a new paper every week. This is back to the emerging technology part of this. But this is what we did, and you'll see how granular it is. It's fascinating to me: we did this a few weeks ago, and today I saw a new version that would have been even better, but we won't go into that. That's how fast this is evolving. We generated a mesh and then we ran a simulation on it. You can see this is the mesh that was generated from that image, and this is the simulation. And we're finding right now that the computational fluid dynamics that can be done this way, in minutes, rather than with millions of cores in hours or days, is about 98 to 99% accurate. That's pretty amazing. Now, I wouldn't fly in an airplane that's 98% accurate, so you definitely still want to do the HPC runs and validate it after you finish your designs. But it's really amazing.

And then what we did is what's very common now, since these are all API enabled: we put it in a loop and had it feed back into the design of the car to improve the drag coefficient. Of course, there's no accounting for taste in the machine learning models; we may need to work on that. But it's really fascinating when you see what you can do here.
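For readers who want the shape of that loop, here is a schematic sketch. Every function in it is a hypothetical stand-in that returns mock values so the sketch runs; the real pipeline described above would call an image model, a NeRF-style reconstruction, and a CFD surrogate in their place, and none of these names are actual AWS APIs.

```python
# Schematic sketch of the generate / reconstruct / evaluate / feed-back loop.
# All functions are hypothetical stand-ins with mock return values.
import random


def generate_car_image(prompt: str) -> str:
    # Stand-in for a text-to-image call; returns an image handle.
    return f"image::{prompt}"


def reconstruct_mesh(image: str) -> str:
    # Stand-in for NeRF-style multi-view reconstruction plus meshing.
    return f"mesh::{image}"


def predict_drag_coefficient(mesh: str) -> float:
    # Stand-in for an ML surrogate trained on CFD runs; returns an approximate Cd.
    return random.uniform(0.24, 0.38)


def refine_prompt(prompt: str, drag: float) -> str:
    # Stand-in for feeding the score back into the next design iteration.
    return f"{prompt} (previous Cd {drag:.3f}, make it lower)"


prompt = "a luxury vehicle with an aerodynamic body"
best_mesh, best_drag = None, float("inf")

for _ in range(10):
    image = generate_car_image(prompt)
    mesh = reconstruct_mesh(image)
    drag = predict_drag_coefficient(mesh)  # minutes with a surrogate, not HPC hours
    if drag < best_drag:
        best_mesh, best_drag = mesh, drag
    prompt = refine_prompt(prompt, drag)  # close the loop

# A real design would still be validated with a full HPC CFD run afterwards.
print(f"Best surrogate drag coefficient after the loop: {best_drag:.3f}")
```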

So after you've done all these things, you can push these digital twins to production and operate them at the edge and in the cloud, and you get this flywheel where it's just continuously improving. And you can kind of see how everything we've been delivering over the last 15 years is building up to these kinds of things.

So let's see how this applies to a city, or an enterprise (enterprises have been software defined for years), or a factory floor, or building automation. Closer to home, what does a software defined house look like, or a software defined car? Or how about a software defined molecule with a quantum computer? We'll talk about that as well. Sometimes it's not how big you can build something that's impressive, but how small.

So in cities, more and more things are getting connected and integrated, and you see this entire cycle happening kind of across the board, with new cities being built all the time, like Neom and others, that are completely focused on fully integrated digital twins.

KONE is a great example of this. KONE runs on all our IoT systems; they're the leading provider of people movers, escalators, and elevators. What you're about to see is their TwinMaker implementation of a Helsinki metro station. They've done it for all the metro stations they operate, and it has real-time IoT data coming in using KVS and other things like that. You see the trains coming and going, you see the escalators and elevators moving; this is real time. But the cool thing is they can simulate all this too. What happens if a train breaks down? What happens if a lot of trains come at once? What happens if we're having a concert and there are a lot of people? How do I do predictive maintenance, and when do I need to do maintenance? All these different things. That's the power of digital twins.

Software defined enterprises have been around for a long time. You're all familiar with ERP systems and things like that; they've tremendously improved production and efficiency in the enterprise, whether it's logistics, CRM, ERP, HR, or financial systems. Companies like Salesforce and SAP and Workday are amazing at doing this, you all interact with them all the time, and being software defined, where they can change things on the fly, has tripled the productivity of enterprises. And organizations like banks, if you talk to people from the banking industry, are all software defined these days; they don't just have big vaults in places. Nasdaq, with their matching engine on AWS, has been able to leverage the cloud to be software defined too, and you've seen the same kinds of things for years in the financial industry, where they've been using HPC and machine learning to accelerate their financial analysis as well. And Nasdaq sees a 10% improvement in round-trip latency in that matching system by running on AWS and on the cloud.

We also see companies like Woodside taking this to the next level, doing end to end simulations and end to end operations with AWS Supply Chain, with digital twins that are physically distributed globally. So we see enterprises now starting to do what used to be impossible. They take every combination of every supplier, every combination of every logistics problem, every combination of the digital twins of their manufacturing, every combination of their logistics out to their customers, and every combination they've ever seen of customer demand, and then synthetically ask: what could happen? What happens if there's a hurricane? What happens if there's a storm? They run those billions and billions of combinations in the cloud and have a machine learning model learn from it. So when any of that happens in the real world, unlike a human who can't consume all that information, the machine learning model can immediately recommend the optimal solution for their enterprise. And I tell you, companies that do that are just going to blow away companies that don't.

So let's talk about manufacturing and factory automation. I'm going to invite Rainer on stage, the CEO of Factory Automation at Siemens, to tell you a little about what we're doing with Siemens. So a warm welcome for Rainer.

Thank you, Bill. Great to be at re:Invent, great to be in Las Vegas after 20 years, and I must say the city has reinvented itself. I'm working for a company which has a history of more than 175 years, and there was a lot of reinvention in that company too, to stay relevant. Let me give you some examples. Siemens started way back and built, for example, the first electric streetcar in 1881. Nowadays, Siemens delivers the software for NASA to fly to Mars, and by the way, Bill told me AWS is also involved in that one. Siemens was the company that created the first X-ray system, and nowadays Siemens is the first company building a real digital twin of a human heart, which makes a big difference for medical care. And Siemens, now coming to factory automation, was the first company to introduce a transistor-based controller for operation on the shop floor in factories. Nowadays, every third machine globally, every third line globally, is controlled by a Siemens controller. And at the state of the art, we build our own ASICs, our own chips, for high performance and low energy consumption.

And why are those boxes relevant? Because they touch every one of you. What did you have for breakfast? Maybe cereal, or the plate you used this morning, or the glass you drank from, or the nice shirt you're wearing: it was produced somewhere in a factory. How did you come to Vegas, by car or by plane? They've all been manufactured in a factory, and that factory is a complete world of its own. That's called operational technology.

I entered this world of operational technology in '96. I was sitting in Ann Arbor, helping Siemens do the automation of the US postal system. I was studying computer engineering at the time and writing code in C++, which was just on the market. I came into that world and said, oh my god, how do they program here? It's called ladder logic, and it looks like an electrical diagram. I said, how can you program in that? But I learned why it makes sense, because at that time it was very much about I/O operations. And I'll tell you what that means: you have an input, and you press a button.

Oh sorry, no: you press a button and something happens. It's a digital operation; you press a button and something happens. Try to write code in Python to do a binary operation like that; it's quite hard. In the PLC it's one line of code. So it makes sense. I learned that, and it creates impact: you press a button and suddenly a big machine is moving something. I kind of had goosebumps when I saw that.

The question is, is that good enough today? Automation brought us a long way, and a lot of the prosperity we have in Western nations comes from productivity out of factories. Every country that doesn't have natural resources basically gained prosperity through industrialization. Automation plays a major role in being more cost efficient, more energy efficient, having higher quality, a faster time to market, and flexibility. But that's not good enough anymore.

We need factories which contribute to sustainable operations. We need to think circular: not only how to produce something, but how to repair it, maybe in an automated way, how to recycle it, how to get the material back. We have big issues around reshoring. In the past, there was one location in the world where you produced something; mass production is perfect for automation because you always do the same thing, without change. But now you want to produce very close to consumption, because you want to be resilient. That means smaller lot sizes, and small lot sizes are normally not so good to automate, because it doesn't make sense to put in all the effort. But you need to do it in an automated way, because you can't find people who want to work in the factory anymore. So if you want to move a factory close to, say, Las Vegas to produce fruit or whatever, you need to find the people, and you don't have them. So you need to automate differently. And it's not good enough anymore to only control the machine; you need to get much more data out. Which data? Not only the data for controlling the machine or switching on a light, but data for secondary uses. For example, I need to know the lifetime of the battery, I need to know the charging situation, I need to reorder the battery when it's empty, I need to know the status of the LED. I need to get much more than just binary data.

But now the problem comes into play. This OT world is quite closed; the door is locked, and we don't get this data out. You are perfect at executing on this data and creating value out of it, but the problem is that this data is locked in, or it takes quite manual work, maybe with some protocols and people who somehow get it out of the factory. It's very closed. So the question is how we get this data, and it's massive data; as we've shown, all the sensors are there. And it's not only new factories; the major topic is brownfield. A lot of existing factories have quite old fieldbuses, things like Profinet, EtherNet/IP, Profibus, a lot of old things. You don't even need to know them, but you need to get this data out. So we need to unlock this. And imagine you get all this data and you can use it, and we make it more software defined: controllers will also be available as containers, so in the future you can run that control wherever you want, which in the past was very hard. We need to unlock this door. We need a continuous flow of data from the shop floor, over the edge, into the cloud and back. And that's why I'm very excited that Siemens, which is the number one globally in factory automation, and AWS, as the main player in the cloud, are joining forces to unlock this data source.

And together we can make the data flow very seamlessly, and not only up into the cloud. As Bill said, they push it back into production, and having that continuous data flow is very important. And it's not only the data flow on this vertical; it's a data flow which goes beyond it, because in the future you want traceability over the complete supply chain, from one factory to the next. You want to know which product was produced, what CO2 footprint that product had in the factory, what materials were used. And you want to take that footprint forward, for example, take that material and that component into the next factory, add the next value-add step, and so on. And you want to have it over the life cycle of the entire product, so when the product is defective, you can repair it, and you need this data to do that. So we need to think not only vertically; we need to think over the life cycle and over the entire supply chain. We need to make this round trip.

As Bill has shown, we need to access the data. As I said, it's not so easy to get all this data out of the existing shop floor. And you need to contextualize the data: it doesn't help you if you have the number 345 and you don't know what it is. Oh, it's the rotation speed of a motor, that motor is built into a pump, and that pump is running in the factory doing this and this; then you have context, and then you can use this data. We want to provide this data contextualized, so you can use it in the cloud for all the things Bill just showed: analyzing it, creating transparency, creating digital twins, training models with that data. And then, very important, creating impact, and you only create impact if you can push it back into production.

So how do we want to do this? I'm very happy to announce that we are joining forces. We are combining Siemens Industrial Edge, which is the leading system for the shop floor, connecting to all the different brownfields but also greenfields, connecting all the different fieldbuses from Siemens and from all the others, and putting a container of AWS IoT SiteWise Edge on Industrial Edge. That container can be downloaded today from the Siemens Industrial Edge Marketplace. You deploy it to the shop floor, to Industrial Edge, and that opens up that universe of OT. You put a pipeline there, and now we can get the data, contextualized data, into the cloud and use it for whatever you want to do, and you can also push it back. It's not only one direction.
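As a rough illustration of what contextualized shop-floor data can look like on its way to the cloud, here is a minimal generic sketch. It is not the Siemens Industrial Edge or AWS IoT SiteWise Edge API; the tag name, asset hierarchy, and topic are hypothetical placeholders, and boto3's IoT data-plane client is used purely as an example transport.

```python
# Generic sketch: wrap a raw controller value with asset context before it
# leaves the shop floor, then forward it to the cloud. Tag names, the asset
# map, and the topic are hypothetical; this is not a SiteWise Edge API.
import json
import time

import boto3

# Hypothetical mapping from raw controller tags to asset context.
ASSET_CONTEXT = {
    "DB10.DW345": {
        "signal": "rotational_speed",
        "unit": "rpm",
        "asset_path": ["plant-1", "bottling-line-1", "pump-07", "motor-4711"],
    },
}


def contextualize(tag: str, raw_value: float) -> dict:
    # The bare number 345 means nothing; attach what it is and where it lives.
    ctx = ASSET_CONTEXT[tag]
    return {
        "value": raw_value,
        "unit": ctx["unit"],
        "signal": ctx["signal"],
        "asset_path": ctx["asset_path"],
        "timestamp": int(time.time() * 1000),
    }


iot = boto3.client("iot-data", region_name="us-east-1")
message = contextualize("DB10.DW345", 345)
iot.publish(
    topic="factory/" + "/".join(message["asset_path"]),
    qos=1,
    payload=json.dumps(message).encode("utf-8"),
)
```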

And you're already very familiar with how to use that data in the cloud. But what's also interesting is that I want to enable the people on the shop floor to use this data as well. There is a problem: those people are not able to write code, they are not full developers. They know the process, they have the domain knowledge, but how can we enable them to use this data? So I'm also happy to announce that we are using Mendix, which is a low-code environment that is already available running on AWS in the cloud, so you can do low-code programming there. Now we can take Mendix and also put the apps directly on the shop floor, and that will enable people on the shop floor to write their own apps, with the domain knowledge they have, using the data from the cloud and from the edge. That will enable them, and that will enable data usage from the OT world in the IT world.

Let me give you some real examples. One customer, you know that company, is Volkswagen, with all their brands like Audi and others. Most of their factories run on Siemens equipment, so the cars are produced with Siemens controllers. But today they produce cars and they don't use the data that is generated all the time while producing the cars for secondary purposes. So what they did: they collected this data, and they didn't write anything new; they simply took the data out of the PLC, out of the controller program, analyzed it, and identified significant bottlenecks in the manufacturing process. Then they changed it, pushed it back into the line, and now they can produce a significantly larger number of cars on the same line. That's real productivity, that's real energy efficiency: using this data not only for building the cars, but for analyzing and improving the process.

The second example is a food company in Spain. I was talking to Marc, who is responsible there, and he said, well, I really have a problem. I want to expand, and they do food production, but I can't find people. They are in a rural area in Catalonia, and basically everybody who lives there is already working for them, so they cannot expand, and nobody wants to move there. So here's the problem: how can we expand our operations when we don't have the people to work there?

And what we did is stretch the boundaries of what you can automate today. There was a lot of manual work, like in fulfillment centers, where you grasp things and put them in a box; today that's manual work. You could approach it by simulating the object, like the synthetic data we have seen, but what we do here is different. We don't simulate the object; we trained the controller with the skill of grasping, so the robot can grasp any part, even one it has never seen before. It was never trained on that part, because it's not the part that was trained; the skill of grasping was trained. Now imagine what that means: in the future you can maybe automate more than only logic. You can use AI to expand automation. You can dream of maybe an autonomous factory, where a machine handles a situation which has never been programmed before, which has never been considered before.

And we also see that with AI, if you want to do that, you need to train in the cloud and deploy the algorithm down to the shop floor. That cannot be done like in the past, where you never change a running system: you install it, it runs, don't switch it off. In the future it will constantly change and constantly deploy new functionality. And that can only be done in cooperation with a major cloud provider like AWS, to make this closed loop a reality and to push automation beyond what we can imagine today. And with that, we have opened the door which was locked, together, Siemens and AWS. Thank you very much.

So let's go from the factory to a little bit closer to home. By the way, I saw that over 1,000 people have already downloaded that app and are playing with it. Let's keep doing that and see if we can overload the WiFi here; give our networking people something fun to worry about today. But, you know, software defined homes are coming. There have been sort of fits and starts with the smart home for a while, but now we're getting to the point where this can start to get really mature and nicely integrated. You see over 302 million smart home devices out there, and this integration of creating a software defined home is going to allow you to do things you never thought of before: automatically closing drapes and windows, changing your air conditioning, prepositioning the temperature of your house before you get home from work, all sorts of interesting things, along with security and thermostat controls. But we want to continue to make it easier to do that, and we want to make it easier for you to apply ML and predictive capabilities to it. This is really going to affect how energy is consumed, and it's going to be another path, one of the many things we have to do, for sustainability.

So we've teamed up with a really innovative company called Tellus, which is integrating a smart home hub with quick-connect devices and full automation for you, where devices automatically connect and you've got voice control, control over lock systems, control over your curtains, your AC systems, all in one nicely integrated package with a nice integrated phone app as well. Look for this soon; Tellus has done some amazing work here and we're looking to roll this out globally with them. We're very excited about this as a SaaS product for the smart, or software defined, home, and it's very scalable, like a digital twin of your house. And software defined vehicles: that's the way everything's going these days, and companies are driving very quickly to be fully software defined.

You may have seen the announcement in Adam's keynote with BMW. We've been working really closely with them to have complete end to end software defined vehicle development and testing in the cloud, along with simulation. The cool thing, again, is that the software running in the cloud that runs in the car doesn't know it's in the cloud; it thinks it's in the car, and you're feeding it information, kind of like The Matrix, where the people didn't know they were in a virtual environment. And you can feed it all sorts of things: like I said, millions of miles of testing, different failure rates, all those things, and see how it reacts in the cloud. So you end up with a very similar cycle, where you have real and lots of synthetic data going into training, you have the full software defined system, you've got the digital twin, the simulation, the ML optimization, and then with things like FleetWise we make it really easy to push it out to the car and run it in the car. Then you get even more data and it just gets better and better; it's self-reinforcing. So of course, at re:Invent we announced more FleetWise capability.

We're adding vision system capability to FleetWise, so you can now aggregate radar, LiDAR, and any of that type of data, along with camera data, in your car and pull it through FleetWise, all seamlessly integrated with the car.

And of course, FleetWise builds a nice data lake for you in the cloud, which you can then use to take a look at any customer's car in TwinMaker. We're working with BMW to accelerate that as well, and with our partner Qualcomm to deliver in the cloud the actual AI 100 accelerators that they use in the car.

And of course BMW, with their next generation vehicle coming in 2025. If you haven't seen this, it's astounding; they're taking it to the next step. It is software defined paint on the car: you can dynamically change the color of your car, which will stop all those arguments between spouses as to what color the car should be. It's going to make registering cars very confusing; I guess you put "any" for the color, I'm not sure what you do there. But it's very exciting to see this coming.

And it just gives you an idea of how things are getting taken to the next level by the innovation BMW is putting out there. Continental is also doing this: they build their entire cockpit on Graviton in the cloud, and they have a developer's workbench that shows the full dashboard of the car. You can even do the voice commands and everything, and test it out right on your desktop. It's running in the cloud; push a button and it pushes it out to the car, then you can test it in your fleet vehicles, and then you can push it to real customer vehicles. Really amazing work that they're doing there as well.

So let's get a little bit smaller; we've gone very big. Let's start talking about molecular simulation and what we can do there. Material science has come a long way, and large language models also offer some hope of solving some of our big material science problems. Material science and the advent of quantum computers will probably have the greatest effect on our lives longer term.

And if you saw Peter's keynote, you saw what our team is doing in quantum computing and some of the things we've been working on there, but the target really is to do molecular simulation. In the history of computing, we literally started computing with sticks and stones; we used sticks and stones to count and keep track of things. Then we used clay tablets and paper to keep track of things, and writing was developed. Then we had a huge innovation with the abacus, and then we used gears and clockworks to start doing computing. So this is emerging technology through the ages, if you like.

And then, as electronics became better understood, we started using tubes for switching and operations, then the first transistors, and then integrated circuits. With the space program, we moved to integrated circuits, where we built RS flip-flops using lithography right on the chip. And now all we're doing is moving a little bit smaller and doing the same thing with atomic particles.

So quantum computers operate with photons or electrons or neutral atoms or ions. Those are the primary technologies that are used today. So let's talk a little bit about quantum computers and see what you guys know about them. You may know more than I do in some cases because it's a pretty amazing subject.

So quantum computers use things called qubits, or quantum bits, that can exist in both states, a one and a zero, at the same time. Is that true, or is that science fiction? What do you guys think the answer is? Yeah, so you guys listened to Peter's keynote: they can exist in two states at once.

Let's see how our app is doing. Well, the app hasn't crashed yet, so that's good. So let's talk about what that actually means. In the digital computers we're all familiar with (by the way, we call those classical computers now, just so you know), your digits exist as ones and zeros and you operate on them as ones and zeros. There are two magic things in a quantum computer. The first is a thing called superposition.

When you're operating on these qubits while they're in superposition, they represent every point on the sphere; they're everything at once, right? And that allows you to represent much larger bodies of information. The second magic thing, if you like, in the physics is entanglement. Entanglement is how we actually program these computers: we're creating the equivalent of a chemical bond by entangling two qubits together. And if you go into Braket today, you can run your own shots on real quantum computers, build those circuits yourself, and watch how the qubits interact with each other. That's what makes it possible to do the amazing things that quantum computers will be able to do.
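As a small, concrete taste of those two ideas, here is a sketch using the Amazon Braket SDK's local simulator: a Hadamard gate puts one qubit into superposition and a CNOT entangles it with a second one. Pointing the same circuit at a managed device instead of the local simulator would submit the same shots to real hardware; nothing here is specific to any one provider.

```python
# Bell-pair sketch with the Amazon Braket SDK: H creates superposition on
# qubit 0, CNOT entangles qubits 0 and 1, and we sample 1,000 shots locally.
from braket.circuits import Circuit
from braket.devices import LocalSimulator

bell = Circuit().h(0).cnot(0, 1)  # superposition, then entanglement

device = LocalSimulator()
result = device.run(bell, shots=1000).result()

# Expect roughly half '00' and half '11', and essentially no '01' or '10':
# the signature of an entangled pair.
print(result.measurement_counts)
```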

But there is a problem, and we have the same problem on existing computers, just not to the same level. On my iPhone, and on all your phones and computers, there are alpha particles flipping bits in the memory, but we have a thing called ECC, error correction code, built in. On our large storage systems like S3 and others, we have error correction built in as well. So software is virtualizing and fixing hardware; there we go again, software defined.

On a quantum computer, there are many, many things in its environment that can cause it to flip its bit or its phase: temperature, pressure, magnetic fields, the ability to fab it with the proper process. If the error rate is too high, then as you add more qubits, which you need to solve these big problems, the noise overcomes the signal and you can't do anything with it.

So error correction is where all of the emphasis is on quantum computers right now, and they're really going to affect the physical sciences first. The reason is that quantum computers work like molecules, so simulating molecules is where it's going to happen first on quantum computers.

So material and physical sciences first, and then, as more error-corrected logical qubits become available, the focus will move to optimization problems and eventually even cryptography. But error correction is really where the focus is, and that's where our team down at Caltech is building the machines of the future that will do logical qubits and error correction.

So let me give you an example of this. You may not realize how often ammonia is used, but it's used in petrochemicals, it's used in fertilizer, it's used everywhere; it's a trillion dollar industry. And we know that ammonia can be produced at a much lower energy state, for a lot less money, than it is today. Ammonia also has the potential to replace carbon fuels. So we need to figure this out, and we know it's possible because bacteria can do it. But if we ran that simulation on all of AWS, all the cloud computers, all the iPhones, and all the laptops on earth, it would theoretically take longer than the history of the universe to complete. A quantum computer with about 1,000 error-corrected qubits could do it in a few minutes. That's a trillion dollar problem that you could solve with a quantum computer, and that's why there's so much focus on it.

So today we have Braket. It's the equivalent of EC2 for this space; it software defines the quantum computers. We have quantum hardware out there and we have quantum networking, and we just announced Braket Direct, which allows you to reserve dedicated capacity on a quantum computer and work directly with the different quantum providers.

So what does a quantum computer look like? You see these a lot: what we call the chandelier is mostly the cooling and communication system, and the quantum computer actually sits down at the bottom. Now remember, we're measuring vibration and spin, and heat is vibration, so we have to make the heat go away; that's a problem. You can kind of see, out of focus in the back there, one of the machines running. That silver component shields out the electromagnetic field of the earth, and then there's a thermos bottle. When this is running, it's the coldest place we know of in the universe; it is as close as we can get to absolute zero, which is astounding to me. It takes us 48 hours from the time we fab a new chip and put it in the machine to get it down to that temperature.

One of the things that Peter announced (he took the thunder away from my team) is that we've come up with, I think, the first better-than-break-even qubit. We're not going to tout it too much until we've done the white paper and had it peer reviewed and all those good things. But basically, it significantly reduces the bit-flip error rate and reduces phase flips as well, and that's what's necessary for a logical qubit. We call it a cat qubit, as in Schrodinger's cat; it's based on a piezoelectric oscillator, and it's a bunch of physical qubits working together to error-correct each other. Very similar to ECC in some ways, very similar to what we do on S3, for example, where we use lots of physical components to error-correct each other. This is going to allow us to scale a quantum computer. It's a huge step forward, but we still need to be a factor of 10 better to scale to enough qubits to solve the real problems. So it's probably going to take us another five to ten years of working on this, and that's why we're talking about it here as emerging technology. But it's pretty exciting.
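The idea of many physical parts protecting one logical value has a simple classical analogue. The sketch below is only that classical analogy, a repetition code with majority voting; it is not how cat qubits actually work, but it shows why redundancy plus a decoding rule pushes the logical error rate well below the physical one.

```python
# Classical repetition-code analogy: encode one logical bit as several
# physical copies, let noise flip some of them, recover by majority vote.
import random


def encode(bit: int, copies: int = 5) -> list[int]:
    # One logical bit becomes several physical bits.
    return [bit] * copies


def add_noise(bits: list[int], flip_prob: float = 0.1) -> list[int]:
    # Each physical bit flips independently with some probability.
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]


def decode(bits: list[int]) -> int:
    # Majority vote recovers the logical bit if fewer than half flipped.
    return int(sum(bits) > len(bits) / 2)


random.seed(7)
trials = 100_000
errors = sum(decode(add_noise(encode(1))) != 1 for _ in range(trials))
# With a 10% physical flip rate, the 5-copy logical bit fails far less often.
print(f"logical error rate: {errors / trials:.4%} (physical rate was 10%)")
```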

But in addition to that, if you have quantum computers, you need quantum networking. So in our Boston facility, we're fabricating diamond vacancies, and I think this is a really cool picture. Over here, I guess on the left side, you can see the actual physical size of the quantum repeater, and under an electron microscope you can see the photons coming off the fiber there and getting trapped in the diamond vacancies. That allows us to repeat the quantum state, and you need this because if you're going to send quantum information over fiber, you have to regenerate the photons about every 100 kilometers, and doing that classically would lose the quantum state. This is going to be very exciting once we can use it to entangle quantum computers with each other, or do quantum key distribution, things like that. So again, emerging technology under development right now.

So if you take anything away from this, you should be thinking about how you software define everything you do. You know, AWS is very much a software defined compute, storage, and network system, and that's been a lot of our success. As Werner said today, hardware fails all the time, so software takes care of that under the covers. It will transform your business, it will transform everything you build, if you fall into this flywheel of continuous improvement. You want to connect and collect everything.

One of the things Rainer focused on is that you've got to keep all that data, and it's really cheap to keep it on things like S3 Glacier. The more data you keep (remember that image of the matrix of the machine learning model being built), the denser that matrix can become and the more accurate your models are going to be. You want to have real and synthetic data, and you're going to see more and more machine learning models working on creating synthetic data for you.
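For a sense of what keeping it cheaply can look like in practice, here is a minimal boto3 sketch that writes an object straight into an archive storage class. The bucket and key are placeholders, and at scale a lifecycle rule that transitions whole prefixes to Glacier tiers is the more typical pattern.

```python
# Minimal sketch: archive an object cheaply by writing it with a Glacier
# storage class. Bucket and key names are placeholders.
import boto3

s3 = boto3.client("s3")

s3.put_object(
    Bucket="my-telemetry-archive",                # placeholder bucket
    Key="factory/line1/2023/12/01/raw.json.gz",   # placeholder key
    Body=b"compressed telemetry bytes",
    StorageClass="GLACIER_IR",                    # or "GLACIER" / "DEEP_ARCHIVE"
)
```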

You want to simulate and use HPC every place you can to optimize your business, along with machine learning. And you just want to create this flywheel that's always running. Our job is to make it easier and easier for you to turn that flywheel, and we'll continue to build out this picture over the next years until you can do it as easily as you downloaded that app on your phone. I hope so, anyway. With that said, thank you very much. Go out and innovate, play with the apps, and I look forward to talking to you at the next re:Invent.
