How AWS and BMW Group team up to accelerate data-driven innovation

OK, then let's start, because all of you are waiting for the presentation. I want to say a very warm welcome, and I'm really happy that you all came here.

So what we want to present to you is how AWS and the BMW Group team up to accelerate our data-driven innovations. I'm from BMW, and I'm here today together with Julian from AWS and with Patrick and Mark from BMW. I think you can read for yourselves who has which title.

So let's get into the context. What we want to show you today is a quick overview of the BMW Group IT, so that you know a little bit how it is set up. Then we want to give you some insights into the BMW and AWS strategic collaboration, which I will do together with Julian. Later, Patrick will show you what we have done in our Cloud Data Hub, which is our foundation here, and then Mark will show what kind of use cases we have built with our AI platform in combination with the Cloud Data Hub.

I think you all know what BMW is: we have four brands, as you can see here, we have around 150,000 employees, and last year we sold around 2.3 million cars. Our IT is structured into 36 domains, which consist of 339 products, and we are distributed across 29 countries, some of them regional offices. But over the last years we have also set up a lot of IT hubs, or DevOps hubs as we call them, to increase our internal headcount for making digitalization happen.

So one of our focus areas is setting up a digital platform, and this is of course based on the cloud, so AWS is a big partner for us here. What we mean by infrastructure is, of course, all the topics around networks, because you have to restructure your network setup in such a way that you can enable it for the cloud. Then we have our technical platform layer, and on top of this we have built up our data and AI platform, which we are talking about today. And then there is a third element, which is more business-domain focused, I would say, which we call functional platforms. These platforms are used by all our IT organizations to set up applications, and what we want to achieve by having and using those platforms is that we can create end-to-end value chains.

Let me give you a short example: we just released our future sales model in China this year in April, and we will release it in Europe at the beginning of next year. If you look at this end-to-end value chain, we go from supply chain to sales to financial services and to after-sales, and we have to connect all of this. If we don't have a common data layer and common platforms, it is very difficult for us to integrate. So this is an important piece of our digitalization journey.

So what we did then, in 2020, was set up a partnership between BMW and AWS, where we announced our collaboration to accelerate data-driven innovations. And you will see later, in depth, what we mean by data-driven innovations when Patrick and Mark present.

So what challenges do we want to solve with this collaboration? First of all, we want to break down silos, and I mentioned it before with this FSM example, the Future Sales Model: we have to connect more and more businesses within our company. Second, we also want to accelerate innovations, and you will see later what we have done around the data portal and other improvements. And the last one is the democratization of data, so that we use it at scale. Where we are heading is that every team should have at least one person who can handle data, look into the data, and get insights out of it. Nowadays, we are at a level of around 20,000 people who access this Cloud Data Hub as a basis for their work.

So what is our collaboration? I now invite Julian to continue here. So, what is our collaboration, and how does it help to tackle the challenges that Alex just mentioned?

So our collaboration, our partnership, is structured around four pillars. The first pillar is our partnership business office. The partnership business office is our joint governance; it helps us take decisions about where to invest our time and resources to maximize the outcome for BMW leveraging data and AI. This is more than the traditional program management office that you sometimes see in such programs. This is really a true partnership, BMW and AWS taking joint decisions about the future of this collaboration.

The second pillar, obviously, is our platform enablement. We have two main platforms as part of this partnership. The first one is our data platform, the Cloud Data Hub (CDH); you will hear a lot about it during this presentation. The second platform is our AI platform, which Mark will present a bit later. The Cloud Data Hub is really the foundation of our partnership, because it enables data-driven use cases as part of our collaboration. For that, we have two very strong teams at BMW developing the platform, supported by AWS Professional Services. But it is not only these teams that collaborate; we also worked directly with the service teams, for example SageMaker, S3, Glue, and Lake Formation, which were directly involved in the platform foundation, making sure that we prioritized features that BMW would leverage later on.

Well, and as he mentioned already, we have to team up here. One of the success factors of our collaboration was definitely that we had joint use case teams. It was not a classic setup where we, as the OEM, are talking to our service provider AWS; here we really joined as one team, and sometimes you could not tell who was from AWS and who was from BMW. This was very important, also working directly with the business on use cases.

And finally, what was very important was training and enablement. As I said before, we have hubs everywhere, 29 locations worldwide where the IT is, and it was very important to enable those teams to use this platform, because it does not help if you create the greatest platform and no one uses it. So this was a really big topic for us: making this enablement happen in China, in South Africa, in Portugal, in all those places.

So key success factors were also that we had top management sponsorship from both companies and, of course, that we acted as one team. What I mean by top management sponsorship is that, for example, we did a 100-day boost at the beginning, where we had to report every two weeks to our CFO and also to the board member for R&D. That helped us, for example, break silos and get data really released so that we could bring it in. Getting this management attention was a really important topic.

And another success factor is that we truly worked backwards from the business requirements. We had a list of the top 20 most important use cases that BMW wanted to tackle leveraging data. And obviously, we also had to measure the success. This is where our partnership business office was very important: we were challenged every day. Are you providing the right value? Are you tracking the right metrics? Is the business case up to date? So we challenged ourselves, both companies, and learned from each other along the way.

And there is a last success factor we also wanted to share with everyone here. This one is a bit special because it came later during our partnership, from a question we got from the BMW board, asking us: OK, now you have developed 20 or 30 use cases, but how can we scale that to 200 or even thousands of use cases that the business needs? And this is where the concept of a flywheel comes in.

So maybe I'll just ask a question to the crowd here: does anybody know the flywheel from amazon.com? Are you familiar with that? Can you raise your hand? I see at least a few, so I'll explain very quickly. It came, in fact, from our founder Jeff Bezos, and as the legend goes, he sketched it on a napkin to explain the growth of the amazon.com business. The way to read it is: you start with customer experience; with a great customer experience, you generate more traffic; more traffic brings more sellers; more sellers bring additional selection; and again the customer experience improves. That spins the flywheel and helps the business to grow. While the business is growing, you also lower your cost structure, which enables you to lower your prices, making your customers even happier, because we know customers like to pay less; that will never change.

So this is the kind of concept that we saw happening during our partnership, and we tried to replicate Jeff's flywheel here on the screen. The way to read ours is to start from the left: we have a very rigorous selection of use cases, which you see on the left. For each use case, our partnership business office was challenging us: are we doing the right use cases? Then we deliver business value out of this data, and these use cases generate new data and new capabilities for the platform, which unlock new use cases. This is where our flywheel was spinning: new use cases, new data, new capabilities, leveraging the new use cases.

In addition, as Alex mentioned, our workforce was also enabled, so they were able to extend the CDH features. AWS was also releasing new features, which again created new capabilities for the platform. This is where we see the virtuous circle of this partnership.

So I mentioned already a couple of times that we were very successful, but we are data-driven companies, at BMW and at AWS, so here are a couple of numbers from our partnership. We could deliver 59 use cases leveraging data over the last three years. But, I would say more importantly, there are now more than 1,200 use cases that leverage data from the Cloud Data Hub. This is where we were really able to scale, thanks to the Cloud Data Hub and the enablement, and you see that there are more than 20,000 users leveraging the power of data, from business analysts to developers to engineers.

So I have mentioned the Cloud Data Hub a lot already, and I'm sure you are very curious to know more about it. I'll now hand over to Patrick for a deep dive on the Cloud Data Hub.

Patrick: Thank you very much. I would love to take you on our journey of how we created the Cloud Data Hub and how we have been able to generate €1.9 billion of business value over the last three years.

Before we jump to the actual solution, I want to share with you a bit of the challenges we had previously on premise. We had an on-premise data lake, and as you can see here, many teams were involved. We had basically two kinds of challenges: organizational challenges as well as technical challenges. On the organizational side, there was a central ingest team as well as a central data engineering team, whose resources basically got prioritized based on the use case needs. This worked pretty well at the beginning, but as the data lake became more and more popular and more and more use cases wanted to build on top of it, there was a huge prioritization process going on to allocate the resources of the ingest team as well as the data engineering team.

On the technical side, we also had challenges around shared resources. Everyone who was working on the data lake was using the same computational resources, and that basically ended up in a situation where a use case that needed huge computational power had to run throughout the night. A limited tech stack was also a challenge for us: basically, we were limited by our initial choice of technology, and upgrading our stack or adopting the newest technology was really difficult.

Last but not least, as the data lake became more popular, it was also a huge challenge for our operations team to make sure that the data lake stayed operational and that we were able to scale the solution with our business needs.

These were the driving factors for us to come up with a new solution, the Cloud Data Hub. When we started off, we had basically two goals in mind. On the one hand, we wanted innovative freedom, so that we as a company are able to innovate. On the other hand, we also wanted strategic standardization, so that we are not reinventing the wheel over and over again.

And here we got inspired by two main concepts: the data mesh approach as well as the data fabric approach. The data mesh approach is a decentralized approach where you typically aim for domain-driven ownership of data, where the individual business units are the ones who own the data and also create data products for the whole company. The data fabric approach looks at things from a more central perspective, providing an integrated layer of data that allows data access and data sharing in a very central manner.

So now let's move from the theoretical part to how we have been applying these concepts in our Cloud Data Hub.

On the left side, you see the data providers, and on the right side, the data consumers. Here we are applying the data mesh approach: we want to enable our colleagues to use whatever tool is available out there and fits their job best. The data providers, in that sense, are the ones who do the data ingestion and build the data products, and they are organized inside our business units at BMW.

Once they have built the data products, the data consumers use them to build their use cases on top. Now, coming to the central parts, the data fabric approach that we have been applying: in the central part of the slide you can see the data portal as well as the Cloud Data Hub, and here we have centralized our data storage as well as the governance.

Here, the providers as well as the governance teams really work hand in hand to make sure that data is created in a compliant way and that we follow all the regulations that exist out there. On top, we also provide building blocks from a central perspective, for the providers as well as for the consumers.

And maybe you now think, OK, there's a contradiction, right? On the one side, we want the data providers and consumers to be free to use whatever they want, and on the other side, we are providing building blocks. What we see here is that even though we are not forcing our users to use our building blocks, they are using them. And the key, in my opinion, is that once you build the right things for your users, they will just love to use them, instead of being forced to use a solution that someone provides from a central perspective.

So now let us dive into the engine room a bit. Here I want to share with you our journey of how we came from a data lake (we also created the CDH first as a data lake) towards a lakehouse architecture. In very short, a lakehouse architecture is basically the combination of a data lake and a data warehouse solution.

On this slide, you see the first iteration of the Cloud Data Hub. On the left side, you have a variety of data sources that are ingested by the data providers, basically onto S3. Here we used Apache Parquet as the main storage format for all the data that is structured, and in the Glue catalog the technical meta information is represented for the data residing on S3.

On the right side, you see a variety of use cases, spanning from analytics to machine learning and reporting, that use the data on S3 either directly or via Athena. That worked pretty well at the beginning, but at some point in time we realized that more and more use cases wanted a more data-warehouse-like approach.
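To make this a bit more concrete, here is a minimal sketch of how a consuming use case could query the Parquet data registered in the Glue catalog through Athena. The database, table, and bucket names are hypothetical, and the talk does not prescribe a client library; this sketch simply assumes plain boto3.

```python
import time
import boto3

# Hypothetical names; the real CDH databases, tables, and buckets differ.
DATABASE = "cdh_vehicle_sales"
OUTPUT = "s3://example-athena-results/queries/"

athena = boto3.client("athena", region_name="eu-central-1")

# Run a query against the Glue-cataloged Parquet data on S3.
query = athena.start_query_execution(
    QueryString="SELECT model, COUNT(*) AS orders FROM vehicle_orders GROUP BY model",
    QueryExecutionContext={"Database": DATABASE},
    ResultConfiguration={"OutputLocation": OUTPUT},
)
query_id = query["QueryExecutionId"]

# Poll until Athena finishes (simplified; production code would add timeouts).
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows[1:]:  # the first row holds the column headers
        print([col.get("VarCharValue") for col in row["Data"]])
```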

And here we decided to change our storage format to Apache Iceberg. Apache Iceberg basically offers you data warehouse capabilities, as if it were a data warehouse, but on S3, so you get the advantage of the scalability that S3 offers while at the same time being super cost efficient.

Once we decided to go with Apache Iceberg, we also saw that we could reduce our processing costs by up to 70%. Now you may ask yourself, how can that be possible? The answer is pretty simple: we had a lot of use cases that always require the latest state. Latest state means, for example, is the window of a car open right now or not; the history is not that relevant for some use cases. But to generate that latest state with the previous Parquet-based setup, our colleagues always had to use a lot of computational resources, because entire files had to be rewritten.

Now, with Apache Iceberg, that is not necessary anymore, and by that we can reduce our processing costs for generating these latest states. Also, for example, when you have a row that you want to delete in a data set, it is now easier: you do not need to rewrite entire files, you can just do that with Iceberg in one single command.
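As an illustration of that single-command, row-level maintenance, here is a minimal PySpark sketch against an Iceberg table registered in the Glue catalog. The catalog name, warehouse path, table, and schema are hypothetical; the talk does not describe the actual Spark or Iceberg configuration used in the CDH.

```python
from pyspark.sql import SparkSession

# Hypothetical catalog/warehouse configuration for Iceberg tables
# registered in the AWS Glue Data Catalog.
spark = (
    SparkSession.builder
    .appName("cdh-iceberg-sketch")
    .config("spark.sql.catalog.glue", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.glue.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.glue.io-impl", "org.apache.iceberg.aws.s3.S3FileIO")
    .config("spark.sql.catalog.glue.warehouse", "s3://example-cdh-bucket/warehouse/")
    .config("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .getOrCreate()
)

# Row-level delete: no manual rewrite of whole Parquet files needed.
spark.sql("DELETE FROM glue.vehicle_state.window_status WHERE vin = 'WBA0EXAMPLE000001'")

# New signals arriving from the vehicles (hypothetical schema).
spark.createDataFrame(
    [("WBA0EXAMPLE000002", "open")], ["vin", "window_state"]
).createOrReplaceTempView("updates")

# Keeping only the latest state: merge incoming signals into the table.
spark.sql("""
    MERGE INTO glue.vehicle_state.window_status AS target
    USING updates AS source
    ON target.vin = source.vin
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```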

Nevertheless, there are also use cases that want that extra layer of performance. Here we introduced native data warehouse capabilities, so that use cases that want to access data very fast can do so. But, super important here: we do not want data to get copied from A to B; the data should remain on S3 as the native storage. The data warehouse solution retrieves the data from S3, and the data remains there as well.

The big advantage here is that all the use cases you see on the right side can still decide either to use the data on S3 directly or via Athena, or to use the data warehouse solution for that extra acceleration when querying the data.
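The talk does not name the data warehouse engine. As one possible realization of "warehouse speed without copying the data off S3", this sketch assumes Amazon Redshift Serverless exposing the Glue database as an external schema and querying it through the Redshift Data API; all identifiers are hypothetical.

```python
import time
import boto3

redshift = boto3.client("redshift-data", region_name="eu-central-1")

# One-time setup (sketch): expose the Glue database as an external schema,
# so the warehouse reads the data directly from S3 instead of copying it.
setup_sql = """
CREATE EXTERNAL SCHEMA IF NOT EXISTS cdh
FROM DATA CATALOG DATABASE 'cdh_vehicle_sales'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-spectrum-role';
"""

# Accelerated query over the same data set that Athena users see.
query_sql = "SELECT model, COUNT(*) AS orders FROM cdh.vehicle_orders GROUP BY model;"

for sql in (setup_sql, query_sql):
    stmt = redshift.execute_statement(
        WorkgroupName="example-workgroup",   # hypothetical Redshift Serverless workgroup
        Database="analytics",
        Sql=sql,
    )
    # Wait for completion (simplified polling).
    while redshift.describe_statement(Id=stmt["Id"])["Status"] not in ("FINISHED", "FAILED", "ABORTED"):
        time.sleep(1)

# Print the result of the last (query) statement.
result = redshift.get_statement_result(Id=stmt["Id"])
for record in result["Records"]:
    print([list(col.values())[0] for col in record])
```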

So now, coming back from the engine room to the data portal: the data portal is the one-stop shop at BMW where everyone can log in, see all data, and work with data. Here we decided that we want to enable the whole end-to-end user journey within the data portal, and the typical user journey starts off with the discovery phase.

You want to find data, but you also want to explore data; you really want to see the structure of the data. Here we offer the ability, if compliance allows, to get a preview of the actual data even without access, to decide whether that data set is the right one to request access to. What we also provide here are capabilities to understand the data.

That includes data lineage, so that you have an understanding of how the data was prepared until it ended up in a data product. The next phase is then the preparation of data. Once you have found, in the data portal, the right data you want to look at, you can request access directly via the data portal.

So there is no need to jump from one portal to another for requesting access; everything is in one single place. What we also offer is the ability to manage data products. The data providers have all the capabilities to manage their products: they can provide a description, create data quality metrics, and so on; all the capabilities you need for managing your data products in one single place.

Last but not least, I talked before about building blocks: from the data portal we also provide the ability to ingest data. So you can ingest data via the data portal into the Cloud Data Hub. What happens here is that you define the parameters, and then your job gets deployed into the provider accounts, so you can use these building blocks even without technical knowledge.
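The portal internals are not described in the talk, but conceptually the ingest building block takes a handful of parameters and deploys a job into the provider's account. A minimal sketch of that idea, assuming a reusable Glue job as the ingest mechanism and hypothetical parameter names:

```python
import boto3

# Parameters a data provider might fill in through the data portal (hypothetical).
ingest_parameters = {
    "--source_jdbc_url": "jdbc:postgresql://example-host:5432/sales",
    "--source_table": "vehicle_orders",
    "--target_database": "cdh_vehicle_sales",
    "--target_s3_path": "s3://example-provider-bucket/vehicle_orders/",
    "--write_format": "iceberg",
}

# The portal would assume a role in the provider account and start the
# pre-deployed ingest job there; locally this is just a Glue job run.
glue = boto3.client("glue", region_name="eu-central-1")
run = glue.start_job_run(
    JobName="cdh-generic-ingest",      # hypothetical reusable ingest job
    Arguments=ingest_parameters,
)
print("Started ingest run:", run["JobRunId"])
```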

The last phase is the creation of value. Here we also provide building blocks for consuming data, for example via Jupyter notebooks when you want to consume data via code inside your use case, but also the ability to create visualizations, for example via QuickSight. And that is what I also want to show later on, how that feels within the data portal.

Last but not least, we also offer the capabilities to apply AI and machine learning on top of the data, which Mark will dive deeper into in his presentation.

So now let's have a look at the data portal and how it feels as a BMW employee to find data. Everyone at BMW can log into the data portal, and then you directly get a Google-like experience for looking for data. The other option is that you can also go to the data catalog, where you can apply filters, browse data products, and look for the right data product that you want to use.

Once you have decided that this is the right data product, you can select it, for example here the road traffic data, and now you get a glimpse of what information you will see for a data product. It is really all about the descriptions, the responsibilities, the data lineage, the consumers, the data quality, and so on. And when you decide that this is the one you want to use for your use case, you can request access directly with a couple of clicks.

So what we have now seen is that we have been able to break down our data silos efficiently. Once you as a company have achieved that, the next step is that you end up with result silos: results get created inside use cases where you typically have no transparency afterwards. How we are tackling that is as follows: within the data portal, you also have the ability to see use cases, and everyone across the globe can see what use cases exist and look for inspiration.

For example, we are now jumping into one use case that creates a result in visualized form. Here we offer the ability that, once the use case owner decides to share their dashboard, anyone at BMW can see the results, and that is super huge, because now you can get inspired by results that others have created, and you can use these results in your processes or make decisions out of them without reinventing the wheel over and over again.

And this is how it looks when you click on one of these dashboards: you see the results directly. That is super great; that is the next step, where we break down the result silos.
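The talk mentions QuickSight for visualizations; one way a portal could render a shared dashboard for a logged-in employee is QuickSight's embedding API. This is only a sketch with hypothetical account, user, and dashboard identifiers, not the portal's actual implementation.

```python
import boto3

quicksight = boto3.client("quicksight", region_name="eu-central-1")

# Hypothetical identifiers; the real portal maps its own users and dashboards.
ACCOUNT_ID = "123456789012"
USER_ARN = "arn:aws:quicksight:eu-central-1:123456789012:user/default/portal-user"
DASHBOARD_ID = "vehicle-orders-overview"

response = quicksight.generate_embed_url_for_registered_user(
    AwsAccountId=ACCOUNT_ID,
    UserArn=USER_ARN,
    ExperienceConfiguration={
        "Dashboard": {"InitialDashboardId": DASHBOARD_ID}
    },
    SessionLifetimeInMinutes=60,
)

# The portal front end would load this URL in an iframe to show the shared dashboard.
print(response["EmbedUrl"])
```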

So now you also want to see some facts and figures, to see what's behind the scenes. Here are a couple of metrics and KPIs that we have been collecting. What we see is that about 30% of the data is used across business divisions, and that really demonstrates how we have been breaking down these silos and how business divisions collaborate, using data from different business units to create insights.

At the same time, we have also been able to generate €1.9 billion of business value in the last three years, and like Alex said, we have around 20,000 users who actively use the Cloud Data Hub. In the middle, you see some more technical KPIs: there are around 7,000 data sets and 2,500 AWS accounts, which really demonstrates the data mesh approach, that you really have different AWS accounts collaborating on the CDH.

The bottom row shows that we are heavily using serverless technologies. We do not want to reinvent the wheel; we want to focus on the differentiating factors instead of doing the heavy lifting ourselves.

Great. Now we have broken down data silos and we have broken down result silos; what's next for us? The biggest challenge for us is: how do we enable our business users to use data inside their processes? Here is how we envision business users being able to work with data within the data portal.

You find Maya. Maya is our AI assistant, and you now have the ability to ask Maya questions or let Maya do analyses for you. So here, for example, I want to know how many electric vehicles got ordered since 2022. Maya now selects the right data set, does the analysis, shares the SQL statement for building trust, and then you also get the results directly.

And that is really where business users can now do analyses without having a techie at their side. I think that is the fundamental step forward for us: applying generative AI on top of the data lake that we have, our CDH, will help our business users to become fully data-driven in their processes.

So now let us see the result for our hybrid cars: how many hybrid cars got ordered since 2022? And we have the result and also the visualization, which you can now share with your colleagues and use for inspiration inside your business processes.
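The talk does not reveal how Maya is built. Purely to illustrate the pattern it describes (generate SQL from a question, show the SQL for trust, then run it), here is a minimal sketch that assumes Amazon Bedrock for the generation step and Athena for execution; the model ID, schema hint, and table names are hypothetical.

```python
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="eu-central-1")
athena = boto3.client("athena", region_name="eu-central-1")

question = "How many hybrid cars got ordered since 2022?"
schema_hint = "Table cdh_vehicle_sales.vehicle_orders(vin, drivetrain, order_date)"  # hypothetical schema

# 1) Let a foundation model draft the SQL for the question.
reply = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID, not BMW's choice
    messages=[{
        "role": "user",
        "content": [{"text": f"{schema_hint}\nWrite a single Athena SQL query answering: {question}\nReturn only SQL."}],
    }],
)
generated_sql = reply["output"]["message"]["content"][0]["text"]

# 2) Show the SQL to the user first, the way Maya does, to build trust.
print("Proposed SQL:\n", generated_sql)

# 3) Only after that would the portal execute it, e.g. via Athena.
athena.start_query_execution(
    QueryString=generated_sql,
    QueryExecutionContext={"Database": "cdh_vehicle_sales"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/assistant/"},
)
```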

And with that, I want to hand over to Mark, who will dive into our AI platform. Thank you very much.

Mark: Thank you, Patrick. As we've seen, AI will change the way we access our data. In many fields, artificial intelligence is innovating our business processes. In order to create those AI solutions, we need to run machine learning with the data of these processes. That's why the availability and quality of this data is crucial.

So let me share with you some of the challenges we had before we had the Cloud Data Hub at the BMW Group. Imagine, way back, when our data scientists started a new artificial intelligence use case without the Cloud Data Hub: they teamed up with business to understand the needs and requirements of the use case, and then together they went on a hunt for data.

They talked to different domain experts to find out which information exists in the enterprise, and then they approached the IT application owners to get access to the data. Then there was a lot of back and forth between the different responsible people in the company to grant this access. And then the data science team had to work with different interface technologies in order to access the data and bring it to a central location to work with it.

They needed to do a lot of data preparation because the data was coming from live systems, and they needed to tackle situations where there was not enough historical data to work with. If they were lucky, the data was already in a data warehouse and prepared, but most of the time they needed to do all this work in the use case. The whole time in the use case was dominated by finding and preparing the data for artificial intelligence.

Our Cloud Data Hub changed this entirely for AI use cases. Today, when a new use case is started, it is registered in the data portal that Patrick has shown before. Then, on the basis of this use case, the data scientists can explore the available data assets of the BMW Group in the catalog. They understand the attributes that are available there, they see the documentation which is linked, they even see other use cases that work with these data assets, and they can exchange with the colleagues on those use cases as well.

They can request access to the data assets in a central location, and there is a process across the organization for how to grant this access to get the data for machine learning. There is a unified way of accessing the data.

So there is no need to work with different interfacing technologies. This whole setup helped to greatly reduce the time to find and access data for machine learning.

So let me illustrate how this worked on a specific use case. I brought a use case from the early days of our collaboration with AWS. This use case contributes to the availability of our products for our customers, because at BMW we offer the power of choice to our customers, and a strong element of this power of choice is to build your own BMW.

For every vehicle model that we offer, there is a variety of vehicle options that you can add to your vehicle, and by doing so you can customize the vehicle that you are driving. For example, for our BMW i5, you could add the rear and front heated seats, or you could add the Iconic Glow kidney grille.

Let me see hands: who of you has already gone to the online configurator and tried to configure their own BMW, or actually bought one? There are a couple of hands. Yeah, so you know about this process.

So our customers expect this power of choice, of course, but at the same time, they also expect the timely delivery of their vehicles. They don't want to see the delivery time moved far into the future just because they selected some special vehicle options.

So this adds a whole dimension to ensuring the availability of our products. To manage this availability, of course, we need to plan ahead. We need to plan the customer demand in order to break it down into the parts we need and agree on procurement contracts with our suppliers so that they can prepare their production.

And we need to do this even before the start of production of a new vehicle model: we need to be able to derive the demand and agree on the volumes with our supply chain.

So how did we do this in the past? There was an expert-based planning process involving regional experts who had the knowledge of their region, its demographic development, and an understanding of past sales there, and who had to build up an understanding of the new vehicle model, its features, and its pricing, and come up with a planning.

For example, a planning expert for the market Germany had to go into the planning system and, for the BMW i4, plan the expected take rate for each vehicle feature. The take rate is the share of vehicles that we expect to be sold with this particular vehicle option.

For example, in this case, on the right, you see that the glass roof is planned with a take rate of 69%. This had to be done for every variant of a vehicle model: for every engine variant, for every region, for every vehicle option.

If we planned too low a volume for a vehicle option, we had challenges with the availability of that option for customers. If we planned too high, we had remnant costs with our suppliers, because we were basically not consuming the parts they were delivering. And the planning effort was high.

We wanted to innovate and have an automatic forecasting solution for this process, and this is where we started an AI use case. The data scientists teamed up with business, went to the Cloud Data Hub, and already found a lot of relevant data assets.

For example, all the historic sales of vehicle models and vehicle options, which they could use, as well as information on vehicle models and vehicle options. Other information had to be added, and there were processes in place to add those data assets to the Cloud Data Hub.

After an intense exploration phase, they found an artificial intelligence model that could predict the customer demand for new vehicle models based on the experience with old vehicle models. And we could verify that, of course, because we were using data from the distant past to predict the outcome in the recent past and validate the model.

Then we teamed up with AWS Professional Services to industrialize the solution and integrate it into the business process. The solution consisted of a pipeline to create the artificial intelligence model by importing the data from the Cloud Data Hub, preparing it, and executing the machine learning part to create a boosted tree model that is able to predict the take rate.
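The exact features and framework of the take-rate model are not part of the talk. As a rough sketch of the "boosted tree predicts the take rate" step, here is a minimal gradient-boosted-tree example on synthetic data with hypothetical features (market, engine variant, option, relative option price).

```python
import numpy as np
import pandas as pd
from xgboost import XGBRegressor

rng = np.random.default_rng(42)

# Synthetic stand-in for historic sales: one row per (market, variant, option)
# of a previous vehicle model, with the observed take rate as the label.
n = 2_000
history = pd.DataFrame({
    "market": rng.choice(["DE", "US", "CN", "UK"], n),
    "engine_variant": rng.choice(["petrol", "diesel", "bev", "phev"], n),
    "option": rng.choice(["glass_roof", "heated_seats", "iconic_glow"], n),
    "relative_price": rng.uniform(0.005, 0.05, n),   # option price vs. vehicle price
})
history["take_rate"] = np.clip(
    0.7 - 8 * history["relative_price"] + rng.normal(0, 0.05, n), 0, 1
)

X = pd.get_dummies(history.drop(columns="take_rate"), dtype=float)
y = history["take_rate"]

# Boosted tree regressor, in the spirit of the take-rate forecasting model.
model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X, y)

# Forecast for a (hypothetical) new model's glass roof in Germany.
new_row = pd.DataFrame({
    "market": ["DE"], "engine_variant": ["bev"],
    "option": ["glass_roof"], "relative_price": [0.02],
})
new_X = pd.get_dummies(new_row, dtype=float).reindex(columns=X.columns, fill_value=0)
print(f"Forecasted take rate: {model.predict(new_X)[0]:.2%}")
```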

This model was then integrated into the planning system so that the regional expert would see the forecasted take rate per vehicle option. We call this semi-automated planning with a human in the loop, and this approach to AI helps to build trust when you move from a manual process to an automated one.

You put the human in the middle; they see the input from the AI, have the ability to override it, and have the ability to understand how the model came to this value. This whole setup has been in place for a couple of years now and has already been applied successfully to the launch of multiple vehicle models.

It helped to improve the availability of our vehicle options and reduce the remnant costs, and solutions like this made our business excited, of course. They see the potential of AI and they want to have more of it in their processes.

But we asked ourselves, looking at the work we spent on those use cases: how can we improve, how can we further scale? Because the effort for the industrialization was still quite high, and we saw a lot of use cases built on AWS that had similar needs but for which different solutions were created.

That's why, in the partnership, we established the architecture stream to find those best practices and solutions in the artificial intelligence use cases and extract them. For example, there was a solution for monitoring data quality against the expectations that were created when building the model for the first time. Those best practices were extracted and generalized for reusability.

And this was the stepping stone to create our MLOps platform. Whereas the Cloud Data Hub provides access to data assets across the BMW Group and a way to consume them, the MLOps platform provides an approach to manage the full life cycle of the BMW Group's AI solutions.

This is very important in order to have a best practice for safe AI and to be able to reproduce the model training, because when operating an AI solution, you need to be able to react and update on new data. That's why our solution consists of three parts.

The first part is about preparing the data. It has capabilities to import the data from the Cloud Data Hub and work with small as well as big data, combine different data assets, aggregate them, and prepare them for machine learning.

The second part is about model training. The solution is flexible enough to support different machine learning frameworks, integrates best practices to validate the model that has been produced, integrates mechanisms so that business can approve new model versions, and versions the data and models as well, in order to reproduce them later.

And in the third part, the MLOps solution provides capabilities to deploy a new AI model version or roll back to an old one, and to monitor it. There are different mechanisms in there to integrate AI with business processes, and a mechanism to monitor the AI in place in order to update it.
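The talk does not spell out the tooling behind approvals, deployments, and rollbacks. As one hedged illustration, assuming the SageMaker model registry is used (SageMaker is mentioned elsewhere in the talk), approving a new model version and switching an endpoint forward or back could look roughly like this; all names are hypothetical.

```python
import boto3

sm = boto3.client("sagemaker", region_name="eu-central-1")

GROUP = "take-rate-forecast"          # hypothetical model package group
ENDPOINT = "take-rate-forecast-prod"  # hypothetical endpoint name

# 1) Business approves the newest registered model version.
latest = sm.list_model_packages(
    ModelPackageGroupName=GROUP,
    SortBy="CreationTime",
    SortOrder="Descending",
    MaxResults=1,
)["ModelPackageSummaryList"][0]
sm.update_model_package(
    ModelPackageArn=latest["ModelPackageArn"],
    ModelApprovalStatus="Approved",
)

# 2) Deploy the approved version by pointing the endpoint at a new config
#    (the config "take-rate-forecast-v7" would be created from the package).
sm.update_endpoint(
    EndpointName=ENDPOINT,
    EndpointConfigName="take-rate-forecast-v7",
)

# 3) If monitoring later flags a problem, roll back by re-applying the
#    previous, still-existing endpoint config:
# sm.update_endpoint(EndpointName=ENDPOINT, EndpointConfigName="take-rate-forecast-v6")
```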

When we started with the MLOps approach, we basically looked for a way to replicate good solutions to new use cases. That's why we came up with a reusable template and shared infrastructure modules that were rolled out to new use cases. With this, we could scale and ramp up new use cases quickly, but the use cases still needed to take care of their infrastructure in the cloud.

That's why we moved on and created a managed service. With this managed service setup, we are able to create AWS accounts for the use cases that are centrally managed. In those AWS accounts, the infrastructure for machine learning and for model deployment is already set up, so the use case teams can focus on their actual use-case-specific parts; we can run updates centrally and manage the security for those use cases.

This drastically reduces the skills needed for the use cases, because they can mainly focus on the integration, the machine learning part, and the data preparation part, and they don't need to take care of the whole infrastructure setup.

But MLOps is just one service for us to scale AI across BMW. From a platform perspective, as Alex outlined in the beginning, digital platforms are very important to empower our business, and for AI we are ramping up a service portfolio to support the sustainable implementation and operation of AI and its integration into the business processes.

We grouped the service portfolio into six service clusters, and they serve different aspects. With AI business services, these are ready-to-use AI-based functionalities that can be directly integrated into business processes. For example, there is a service to translate texts via an API, and there is a service for intelligent document processing to extract information from documents.
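The talk does not say which services back these internal APIs. As a hedged sketch of the two examples (translation and document extraction), here is what calling comparable managed AWS services directly could look like.

```python
import boto3

translate = boto3.client("translate", region_name="eu-central-1")
textract = boto3.client("textract", region_name="eu-central-1")

# Translation as an API call, similar in spirit to the internal translation service.
translated = translate.translate_text(
    Text="Das Fahrzeug wird in Kalenderwoche 48 ausgeliefert.",
    SourceLanguageCode="de",
    TargetLanguageCode="en",
)
print(translated["TranslatedText"])

# Intelligent document processing: extract text lines from a scanned document.
with open("supplier_invoice.png", "rb") as f:      # hypothetical local file
    extraction = textract.detect_document_text(Document={"Bytes": f.read()})

lines = [b["Text"] for b in extraction["Blocks"] if b["BlockType"] == "LINE"]
print(lines[:5])
```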

Then, with the data science workbench, we offer solutions for our data scientists to create new AI models. We have a lot of use cases that work with automatic machine learning there to create AI models, but we also offer a code-centric data science environment for machine learning based on Amazon SageMaker Studio.

All the models that are produced there have a path to be industrialized into an AI solution using the MLOps platform services. Then, with data labeling, we offer services to store and manage unstructured data, such as video data and acoustic data, in addition to the Cloud Data Hub, as well as services to label the data centrally.

Then, with conversational AI, we provide a platform to implement and operate digital assistants. With AI governance, we take care that all the AI use cases are registered centrally, that their models are stored in a central BMW Group model catalog, and that we are able to run risk assessments for those AI use cases, which is very important regarding the upcoming regulations.

For example, in the European Union, with the Artificial Intelligence Act. So AI is becoming more and more part of our business processes. Looking at the main processes at the BMW Group, let me give you some examples of how AI already helps there. For product development, AI helps us to design the vehicle models and vehicle features that our customers desire.

For example, there are use cases that use AI to derive the control levers that influence the performance of our vehicle functions, for example the sound that we have outside and inside the vehicle while driving. What are the control levers there? AI models help us to understand that. For supply chain and logistics, AI helps us to optimize, as we've seen in the option take rate forecasting example.

We even use AI for our autonomous logistics robots that contribute to logistics in our worldwide plants. Then, in vehicle production, AI helps us to achieve high quality in our vehicles. Alongside our production lines, we have a lot of cameras, acoustic recordings, and other sensor measurements, where hundreds of AI models are working to assess the quality and support us in quality assurance.

And in sales and after-sales, AI is helping us to innovate the interactions with our customers. For our workforce, we are using AI, for example, to support them in finding the right information and asking the AI about corporate knowledge.

So I'm really looking forward to seeing this growth in the BMW Group and to being a part of it. Wrapping up, I hope we could inspire you with our journey of accelerating data-driven innovation at the BMW Group together with AWS.

You've heard from Alex how digital platforms are important to scale digitalization alongside our business processes. Then Alex and Julian shared how we established the partnership and moved on to prioritize and implement use cases to drive innovation.

Then Patrick shared the story of how we established the Cloud Data Hub as our central data platform to fuel all these use cases. And then we shared the story of how we went from the first artificial intelligence use cases on the Cloud Data Hub to best practices to a service portfolio of an AI platform.

So I hope you are curious now to see more of this. There are two opportunities I would like to share with you. This evening in the Venetian, you can have a look at an analytics use case: how we managed, during the semiconductor shortage, to analyze our demand for semiconductors using information in the Cloud Data Hub.

And then tomorrow there is another session for you where we explain how we integrated Amazon QuickSight into our data portal to streamline business intelligence directly in the data portal.

Thank you all for listening.
