Smarter, faster analytics with generative AI & ML

And by all accounts, we are only scratching the surface. My name is Kinshi Pare, and with me is Kelly Burton. Today we are going to talk about applying generative AI and machine learning to analytics.

For the agenda, we are only going to spend under five minutes defining generative AI and the modern data stack. We are going to focus most of the presentation on diving deep into all the announcements you have heard in the last few days.

More importantly, we have not one, not two, not three, not four, but five demos across the analytics products. So stay tuned.

So, generative AI. It's a subset of artificial intelligence. While traditional artificial intelligence has focused on pattern recognition and forecasting, generative AI generates new content, sometimes original content. These models employ machine learning algorithms, and they are trained using large amounts of data and large amounts of compute, more specifically GPU compute.

Now, AWS is the best place to build a data strategy to fuel your generative AI applications. It has the most comprehensive set of capabilities, whether around storing, querying, processing, or business intelligence. It is well integrated, you have options for integrating data using ETL, ELT, or zero-ETL, and it has governance support built in to ensure that you practice responsible AI.

Now, this is the end-to-end data foundation. You can ingest data from a wide variety of sources using a data integration service like AWS Glue, and then you can store, query, and analyze it using a variety of processing engines, storing it in a data warehouse with Redshift or in a data lake. Finally, once you have curated data in the data warehouse or data lake, you can use SageMaker or QuickSight, depending on your use case, for analysis and for building machine learning applications.

In this presentation we are going to start from the left-hand side, which is data integration. We will look at data quality checks, then talk about data governance and data storage in a data warehouse like Redshift. Finally, Kelly is going to talk about QuickSight and the business intelligence piece.

So let's start with data integration. AWS Glue is a serverless data integration service. It allows you to connect to a wide variety of data sources: SaaS applications, data warehouses, databases. Glue Studio offers a variety of authoring interfaces. You have the low-code, no-code interface, where you can define the DAG completely visually, or you can use the wrangling interface, which lets you express your data transforms as recipes.

The second option is the Glue Studio notebook. If you prefer authoring your data integration jobs in code, in PySpark or Scala, you can spin up a Glue Studio notebook and author your jobs that way.

Now, CodeWhisperer lets you build applications faster. It's your AI coding companion that helps you author code across a variety of languages, and do it very rapidly. This year we integrated CodeWhisperer with Glue Studio notebooks. The idea is that you can generate code for your data integration pipelines, including complex transforms that are difficult to code by hand; with the CodeWhisperer integration, that code can be generated for you automatically.

So next I'm going to show you a demo of how this works. Before you can use it, there is one small step to take care of: give the role you use in the notebook permission to call CodeWhisperer. You can just add an inline policy to your role, search for the CodeWhisperer permissions, and choose the permission to generate recommendations. Give it a name, add it to the role, and now you can use it.
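
If you prefer to script that setup instead of clicking through the console, a minimal sketch with boto3 might look like the following. The role name and policy name are placeholders, and the assumption is that the codewhisperer:GenerateRecommendations action is the permission being granted.

```python
# Hypothetical sketch: attach an inline policy to the Glue notebook role so it
# can request CodeWhisperer recommendations. Role/policy names are placeholders.
import json
import boto3

iam = boto3.client("iam")

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "codewhisperer:GenerateRecommendations",
            "Resource": "*",
        }
    ],
}

iam.put_role_policy(
    RoleName="GlueInteractiveSessionRole",       # placeholder role name
    PolicyName="codewhisperer-recommendations",  # placeholder policy name
    PolicyDocument=json.dumps(policy_document),
)
```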

So I have created a notebook configured with the role that I just authorized to use CodeWhisperer. Now CodeWhisperer is running and waiting to help me. I also started a session with the common configuration, and now I'm going to show you a few examples of how CodeWhisperer can help you.

Let's start by building a sample dataset so we can play with it. I'm going to ask it, and once I press Enter, it starts offering me suggestions; you can also trigger it manually. Let's pick this one: I press Tab to accept it. It keeps offering more suggestions until I finish the line, and then it offers to fill in the column names for the DataFrame. This looks good, Tab to accept, and that's it, I now have my DataFrame. It's now suggesting whether I want to convert it to a DynamicFrame; not yet.

Now I'm going to ask it to count the unique values. It gives me the code, I just press Tab to accept and Enter to run the cell, and it tells me that, yes, all the values are unique.

Let's now do something a little more practical. Let's say I want to add a column with the total price for each order. The prompt is a little vague, but the model is able to figure it out correctly. Let's print the result and check that it's correct. Yes, it's multiplying the columns, and it looks right.

To wrap up, let's imagine you now want to save the DataFrame as Parquet files on S3. It suggests the line I need to use; I just have to fill in the path, and that's it. In this short demo, you have seen the possibilities that artificial intelligence, through CodeWhisperer, can bring and how it can help you be more productive when writing your code. Thank you for listening.
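
As a point of reference, here is a rough, hand-written sketch of the kind of code the demo walked through: building a small sample DataFrame, checking for unique values, adding a total price column, and writing the result to S3. The column names and the S3 path are illustrative, not the exact suggestions CodeWhisperer produced on screen.

```python
# Illustrative reconstruction of the notebook demo (not CodeWhisperer's exact output).
import pandas as pd

# Build a small sample dataset of orders to play with.
df = pd.DataFrame(
    [
        [1, "keyboard", 2, 35.0],
        [2, "monitor", 1, 180.0],
        [3, "mouse", 3, 12.5],
    ],
    columns=["order_id", "product", "quantity", "unit_price"],
)

# Count unique values: confirm every order_id is unique.
print(df["order_id"].nunique() == len(df))

# Add a column with the total price for each order.
df["total_price"] = df["quantity"] * df["unit_price"]
print(df)

# Save the DataFrame as Parquet files on S3 (placeholder path; needs pyarrow and s3fs).
df.to_parquet("s3://my-bucket/demo/orders/")
```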

Now, the second important aspect of the modern data stack is having high-quality data. One of the biggest challenges is that access to quality data is critical for every downstream application you build; whether it's a business intelligence application or a generative AI application, they all rely on having quality data available.

Traditionally, what you do is define rules for your data quality checks. Examples of rules are: for this column, the value should stay within this range, or the value should not exceed a certain threshold, or it should not contain null values, or it should contain only integers, and so on. But those are not enough, because your business environment changes; the thresholds you set up today may not be relevant tomorrow, which means you constantly have to update your data quality rules.

So this week, we announced anomaly detection and dynamic rules. This is a machine learning based capability we built into AWS Glue: it detects anomalies based on historical analysis and generates insights, which allow you to create dynamic rules. I'll show you an example of what a dynamic rule looks like.

With a static rule, as I mentioned, you have fixed thresholds. With a dynamic rule, you can make it more interesting: you can say the rule is that the value should fall within the average of the last 10 runs, plus or minus two standard deviations. When the business environment changes, the rule stays relevant.
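
To make that concrete, here is a minimal sketch of what static and dynamic rules can look like in Glue Data Quality's rule language (DQDL), evaluated from a Glue job with the EvaluateDataQuality transform. The table, column names, and the exact dynamic-rule expression are illustrative assumptions, not rules taken from the session.

```python
# Minimal sketch: evaluating a DQDL ruleset with static and dynamic rules in a Glue job.
from awsglue.context import GlueContext
from awsgluedq.transforms import EvaluateDataQuality
from pyspark.context import SparkContext

glue_context = GlueContext(SparkContext.getOrCreate())

# Placeholder source: any DynamicFrame produced earlier in the pipeline.
orders = glue_context.create_dynamic_frame.from_catalog(
    database="demo_db", table_name="orders"
)

# The first two rules use fixed thresholds; the dynamic RowCount rule compares each
# run against the average of the last 10 runs, plus or minus two standard deviations.
ruleset = """Rules = [
    IsComplete "order_id",
    ColumnValues "quantity" between 1 and 1000,
    RowCount between avg(last(10)) - 2 * std(last(10)) and avg(last(10)) + 2 * std(last(10))
]"""

EvaluateDataQuality().process_rows(
    frame=orders,
    ruleset=ruleset,
    publishing_options={"dataQualityEvaluationContext": "orders_dq"},
)
```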

So the key idea is that you don't have to manually keep editing and updating rules based on changes to your business environment or seasonality. The way it works is very simple: you create data quality rules, or just specify a column; we gather data statistics, and the machine learning algorithm automatically generates recommendations.

You can also stop there: if you don't want to create rules, you can just generate alerts when the value changes or the deviation is significantly higher. But the algorithm can also intelligently recommend dynamic rules, which you can apply in your data quality checks.

So this is the second demo where we are going to talk about data quality.

Hello, I'm Alison, a solutions architect at AWS. I'm going to show you how easy it is to enable data quality on the data pipelines you build. I'll show you how AWS Glue learns from your data to identify anomalies that cause data quality issues, helping you build and refine your data quality rules progressively.

Here I have a New York taxi dataset pipeline. It takes the data on a daily basis, aggregates it, and loads it into another S3 bucket. Now, to enable data quality checks, I simply add the data quality transform. I really have no time to configure data quality rules right now, so I'm going to use the new ML capability to help me build rules that I can evolve later.

For this, I'm going to add an analyzer called a row count analyzer. Analyzers gather statistics and detect anomalies in your data; this one is going to gather our row counts, learn the patterns, and start detecting anomalies. I've run this job a few times now, and for each new dataset Glue has learned the patterns of my loads.

I've also referred to the documentation and set up alerts to let me know when new anomalies are detected. Now the disaster day comes: our data file has low volume. Without this detection capability, my Glue job would have just run fine, and I could have spent hours later debugging the issue.

With Glue Data Quality's anomaly detection capability, it's easy: simply go to the Observations tab, and notice that Glue Data Quality has already generated an observation about the abnormal drop in row count. You can also see this visually in the graphs. Now I'm interested in monitoring, and even stopping my job, when I see this abnormal row count. For this, I'm going to apply the recommended rule in my data pipeline by clicking Apply rules.

Then I'm going to convert my rule into a dynamic rule. With this rule, my row counts will be monitored with a dynamic threshold that looks at the running average of the past 10 executions. With Glue Data Quality, you can combine the power of machine learning and data quality rules to deliver high-quality data to your business users, enabling them to make confident business decisions. Thank you.

So far, we have looked at the ingestion pipeline; we are at the raw-data-to-curated-data stage, and we have done the data quality checks.

Finally, this is a new announcement we made yesterday: end-to-end data integration job authoring. With generative AI, you can just describe your intent and it will generate your entire data integration pipeline. For example, you can say something like "ingest data from S3, join on column account_id, and save the data to S3 using Parquet," and it will generate an end-to-end ETL pipeline that you can review and deploy. That is the new announcement we made as part of Swami's keynote.
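
For a sense of scale, here is a hand-written sketch of the kind of Glue ETL script such a prompt could correspond to: read two datasets from S3, join them on account_id, and write the result back to S3 as Parquet. The bucket paths, input format, and dataset names are assumptions for illustration; this is not the actual generated output.

```python
# Illustrative Glue ETL job: ingest from S3, join on account_id, write Parquet to S3.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Ingest the two source datasets from S3 (JSON is an assumed input format).
accounts = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://my-bucket/raw/accounts/"]},
    format="json",
)
orders = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://my-bucket/raw/orders/"]},
    format="json",
)

# Join the two DynamicFrames on the account_id column.
joined = accounts.join(paths1=["account_id"], paths2=["account_id"], frame2=orders)

# Save the joined data back to S3 in Parquet format.
glue_context.write_dynamic_frame.from_options(
    frame=joined,
    connection_type="s3",
    connection_options={"path": "s3://my-bucket/curated/orders_by_account/"},
    format="parquet",
)

job.commit()
```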

OK, so now we have ingested the data. Next, let's talk about the data warehouse.

Now, Redshift Query Editor V2 is, as you may know, a free, web-based SQL authoring interface. This year, we added a notebook capability to Query Editor V2, and with Amazon Q it allows you to create schemas, load data, and look at your dataset holistically.

Now you can author SQL queries from natural language prompts. This significantly increases productivity and allows you to author complex SQL based on the datasets that already exist in your system, and it continuously improves the accuracy of the generated SQL. How it works: Query Editor V2 sends the query context and receives SQL code suggestions that you can copy into your notebook and execute. We are going to see this in the demo.

So here you have Query Editor V2. This interface should be familiar; we are just going to add a notebook tab. On the notebook tab, you can click on generative SQL, which opens a chat interface.

In the chat interface, you can express what you want to achieve in natural language. In this case, we'll ask for the birthday of the user who bought the most tickets. You don't have to specify the table, the database, or any of those things. It generates a response that you can add to your notebook.

Let's look at a second example, which is even more abstract: how many tickets did he buy? Again, it generates the SQL, which you can add to your notebook, and finally you can review the generated code and click run.
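
To give a flavor of the output, here is a sketch of the kind of SQL the first prompt might produce, run here through the redshift_connector Python driver. The users and sales tables and their columns (userid, buyerid, qtysold, birthday) are assumptions about the demo dataset, not the schema shown in the session.

```python
# Hypothetical example: run generative-SQL-style output against Redshift from Python.
import redshift_connector

conn = redshift_connector.connect(
    host="my-workgroup.123456789012.us-east-1.redshift-serverless.amazonaws.com",  # placeholder
    database="dev",
    user="awsuser",
    password="********",  # placeholder
)

# The kind of query "show the birthday of the user who bought the most tickets"
# might translate into, against an assumed users/sales schema.
sql = """
SELECT u.firstname, u.lastname, u.birthday
FROM users u
JOIN (
    SELECT buyerid, SUM(qtysold) AS total_tickets
    FROM sales
    GROUP BY buyerid
    ORDER BY total_tickets DESC
    LIMIT 1
) top_buyer ON top_buyer.buyerid = u.userid;
"""

cursor = conn.cursor()
cursor.execute(sql)
print(cursor.fetchall())
```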

This is another announcement, and one of my favorite features among the recent announcements: AI-driven optimization. I like to think of it as automatic tuning based on your data, your query complexity, and your concurrent users. Redshift will automatically scale to support the workload, and with the AI-enhanced optimization and forecasting, it automatically organizes the data in ways that go beyond traditional encoding.

Enabling it is very simple, just a two-step process: you create a preview workgroup and then adjust the price-performance toggle. You can move it all the way to the right to optimize for performance, or all the way to the left to optimize for cost.

OK, so we talked about integrating data, and we have the data in the data warehouse. Now, let's talk about governance. Governance provides guardrails to innovate faster. For any application, good intentions don't work; you need the right mechanisms to ensure that users can develop their applications and share the data, with the right security policies protecting that data.

Amazon DataZone allows you to achieve just that. You can manage organization-wide governance policies; it simplifies access for analytics, it catalogs the data, and you can annotate the data to support specific business use cases for business users. But the biggest challenge is finding, understanding, and documenting data. It's a complex, menial task, and not a very interesting one. To address this, we announced AI recommendations for descriptions. Using the power of generative AI, you can generate detailed and contextual descriptions for your data, and it is very easy to enable: all you have to do is find the data asset, click generate summary, and then either accept the suggestions as they are or edit them based on your own insight before accepting.

So we have gone from taking your data, integrating it, and performing data quality checks, to storing it in Redshift in an optimized way, cataloging it, improving end-user productivity with generative SQL, and finally generating catalog and governance metadata using generative AI.

Now I'm going to hand over to Kelly Burton, who is going to talk about QuickSight.

Great, thank you, Ken. I have the distinct pleasure of sharing with you how we're going to empower the business with our AI capabilities. As Kensho mentioned, we're going to round this out by talking about the business intelligence piece. Business intelligence at AWS is about Amazon QuickSight. Amazon QuickSight allows you to create modern dashboards, use machine learning capabilities such as forecasting and anomaly insights, and ask questions via natural language query. That was all there prior to this week. We have paginated reports, so think headers, footers, full tables, printing, multiple pages, your traditional bursted reporting, as well as support for a number of different kinds of embedded analytics: we can embed visuals, dashboards, and the actual authoring experience in an external web page. So all of that existed prior to this week.

We're excited to announce some generative BI capabilities this week. When we talk about generative BI, built upon Amazon Q in QuickSight, we're thinking about productivity gains: how do we empower business users to be more efficient and more productive? Generative BI with Amazon QuickSight is built on the AI capabilities of Amazon Bedrock large language models. It creates a new natural language experience that delivers reliable analytics through QuickSight's analytics engine. What that means is we're empowering business users to collaborate on data across the organization more efficiently and get to data-driven insights faster.

We're going to take a look at a few examples, and everything I'm talking about today is currently available in preview. Some of the authoring capabilities were available prior to today, but everything shown today you can access in preview with Amazon QuickSight.

The generative BI capabilities in QuickSight are really focused on three areas. The first is our AI-powered dashboard authoring experience. What that means is we're supercharging business analysts to build dashboards more quickly and efficiently. It also lowers the technical bar and the learning curve, so people who weren't traditionally analysts can build dashboards.

The second area, introduced on Tuesday, is the ability to create AI-powered insights on demand, allowing business users to ask a question whenever it's convenient for them. If it's 10 o'clock at night and they're prepping for a meeting at eight o'clock in the morning and they need answers, they can ask a question and get a multi-visual response immediately, surfacing insights from the data.

And finally, we're going to talk about AI-assisted data storytelling. Data stories allow business users to discover and share findings to persuade others to take action; think of your traditional monthly or quarterly reports. We'll talk about that in more detail.

We're going to focus on each of these areas individually. The first hat we're putting on is business analyst: think dashboard developer, somebody who has to create the content. With our generative BI authoring capabilities, you can now create visuals using natural language. You can say "show me sales by month"; you're not dragging and dropping, making selections, or picking fields. QuickSight, with the generative BI capabilities, will build the visual for you.

The second piece, and my favorite part of the authoring experience, is creating calculations. Authors can now describe the calculation they want to create, and the syntax is created for them. It really lowers the bar of having to learn new syntax with every tool. And finally, once you have built the dashboard, before you really want to share it, there's the fine-tuning that takes place: things like turning off zoom bars or changing the chart type, things that typically take five to ten mouse clicks to make it look really good. You can now edit visuals with natural language. You can say "change the chart type to a donut"; you can change the field types in a visual or change the time granularity. These are all things that help an author become more productive and get dashboards out and available.

We're going to dive a little deeper into this piece, but I'm going to ask you, I know it's late in the day, to join me on a journey, because I want to take you through a day in your life and what it really takes to build something. So put on your analyst hat: you're an analyst at a fictitious auto manufacturer. It's 4:30 in the afternoon, and your manager comes to you and says, "I have a planning meeting for next quarter at 10 o'clock in the morning, and I need a dashboard. I need to understand our sales performance metrics across multiple dimensions to find areas to focus our campaigns on next quarter." OK, it's 4:30, I have dinner plans, and I don't want to leave this overnight, so I'm going to use generative BI to expedite the dashboard creation.

So let's take a look. Here we are within the QuickSight interface. If you're not familiar with QuickSight, this is our analysis view as an analyst. We've got our fields on the left-hand side, our visuals pane right next to that, and then the canvas where we build our dashboard. I'm going to start by giving the sheet, or tab, within the dashboard a name; we'll call it Sales Dashboard.

Then I'm going to ask Q, QuickSight Q, to build me a visual. I want sales by month, thinking about trends. When I ask, it comes back and says maybe you want a line chart. I can take a look and get key insights delivered to me: I can see my top movers, my bottom movers, and my anomalies, and I can even add a forecast to that line chart directly within the interface. I like this chart, so I'm going to add it to my analysis.

So how are we looking at trends? We've got a monthly trend and we've got a forecast. After a little resizing, I'm going to ask QuickSight to build another visual; this time I want to look at my month-over-month percent sales difference. When I ask for that, QuickSight says you likely want a KPI, so I've got my KPI and I add that to my dashboard. Then I do a little navigation here: we'll size this down and move it directly under our trend chart. Just to show you some additional features within QuickSight, as I resize this I'm going to change the title so it reads "Month over month sales percent difference." And notice I only have two points on this chart: because I asked for month over month, I've just got my last two months of data. Generative BI applies filters as well; I only asked for month over month, so it gave me the last two months. But I have options: maybe I do want to see the whole trend, so I can disable that filter and see the full trend line, and I could also turn the trend line off.

You can ask for specific visual types as a builder. I know I want to see a Sankey diagram of sales comparing region and body style, so I can specifically ask for that and have the visual built for me. That visual would have taken me at least seven mouse clicks and three drags to create; I can do it in a few seconds just by asking the question. Doing a little sizing here, and I've got region, body style, and a sales trend.

Something I don't have is country performance. Maybe I'm interested in my top five countries, so I'm going to ask that question. But notice what I didn't do: I didn't say top five countries by what. Which metric do I want, profit or sales? So it makes a guess: maybe top five by profit is what you want. But because the question was vague, we get the "Did you mean?" section, which offers other suggestions. In this case I wanted sales, so I made that selection. We'll size this down and move it to the top. So we're really making great progress building our dashboard.

I do want to focus on just my data for this year, so I'm going to create one more visual and say: build me a visual of sales by month and body style for 2023. Notice that when I do that, you can see the interpretation below: it did apply the 2023 filter, and it gave me a table. I really don't want a table, I want a bar chart, and as the builder I can very quickly change that chart type and then add it to my analysis.

So we've spent less than five minutes building out several visuals that meet the needs of our manager, but we always want to exceed expectations, so let's talk about calculated fields. Maybe we want to add some new dimensions for them to look at in that sales planning meeting. In this case, we're going to ask it to create a vehicle type field. In my data I've got fuel type: hybrid, electric, and gas. Maybe I just want to compare gas vehicles against any type of EV, so I can ask for that and say: if it's hybrid or electric, I want it to be EV; all others are gas. And it creates that if/else calculation, the syntax, for me.

So now I have that vehicle type field to use within a visual. In this example, I'm creating the visual manually to show you the QuickSight interface and the three panes. Here we pick our vehicle type and our order date, look at them against total sales, and then I change the aggregation of order date to year. So now we're looking at our new vehicle type field and we can see our sales data. But maybe we want more: we've got sales, but what about profit? What does our profit margin look like?

So I'm going to create another calculated field and say: calculate profit margin. In this case, I say to use total sales and total profit, because I know I have those fields in the data. It creates that profit margin calculation for me. Notice I have the opportunity to review it; I can choose to insert it into the expression builder and then save the calculation, and I could also make modifications there as well.

So I'm going to add the profit margin calculation to our table. At this point I like the table, but maybe a pivot table would look a little better, so I change my chart type to pivot and put our date field on the columns. We've created two calculated fields very quickly, and I did not have to look anything up in the documentation to figure out how to build them; I was able to use the new generative BI capabilities to create them for me. We're going to create just one more quick calculation, because we want to look at the month-over-month percent difference, and this is a question we get a lot: which functions do we use, and how do we create those calculations?

We have a whole range of period-over-period functions, but instead of having to learn or look up what it takes to create that syntax, you can just say "create the month over month calculation" and it gives you that period-over-period percent difference calculation. So we save this calculation and create one more quick visual: a line chart with order date on the x-axis, aggregated up to the month, our month-over-month percent difference calculation as the value, and the line chart split by vehicle type, so we have a line for each vehicle type.
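
For readers curious what that generated syntax looks like, here is a sketch of calculated-field expressions along the lines of the three created in this demo, written as Python strings you could paste into QuickSight's expression builder. The field names (fuel_type, total_profit, total_sales, order_date) and the exact expressions are assumptions for illustration, not the output shown on screen.

```python
# Hypothetical QuickSight calculated-field expressions for the three demo calculations.

# Vehicle type: hybrid or electric becomes "EV", everything else "Gas".
vehicle_type = "ifelse({fuel_type} = 'Hybrid' OR {fuel_type} = 'Electric', 'EV', 'Gas')"

# Profit margin from total profit and total sales.
profit_margin = "sum({total_profit}) / sum({total_sales})"

# Month-over-month percent difference in sales.
mom_pct_difference = "periodOverPeriodPercentDifference(sum({total_sales}), {order_date}, MONTH, 1)"
```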

So our manager asked us to look at sales performance metrics within our data, and in less than 10 minutes we were able to do that, plus add a number of calculations and some additional breakdowns. Now we've sized our dashboard and it looks pretty good. As I look back through, I've created visuals and I've created calculations, but I know this is getting shown to a large group, so is there anything I need to fine-tune to make it more visually appealing?

Well, the first thing I see is that country total sales chart. I'm showing the top five, and I don't really like the bar chart; maybe I want to make it a donut chart instead. So I can edit with Q and say "change this to a donut chart," and the chart type changes. Notice there's a link that tells you everything you can do with edit, so you can explore other options. Right now we're showing two different monthly trend charts; we really don't need the monthly trend twice, and this is for a quarterly planning meeting, so maybe we want to change the level of granularity of this chart. On this one, I can say "hide the zoom and change the granularity to quarter," so we can show it by quarter.

So instead of having to make the six or seven mouse clicks to apply those changes, I can do it immediately using natural language. All of this is available in the authoring portion of the generative BI capabilities. I just changed the title, so I'm looking pretty good. The only thing that looks a little bland, where I'm really having to hunt through to understand it, is that pivot table; maybe I want to add some conditional formatting to really draw out the key areas.

Conditional formatting is one of the things you can use with the edit capability, so I'm going to say "make sales greater than 50 million dark green." You could use multiple facets, and you could apply it to profit margin as well. If you've used the conditional formatting functionality before, you know it takes several steps and several mouse clicks to set up; it's very easy to apply with generative BI.

So now I've got my dashboard created. I publish it, and it's available to share with my business users. Again, it was 4:30 in the afternoon when my manager came to me with this ask, and in less than 15 minutes we have our dashboard built and ready to share. That's the productivity gain the authoring features bring to you.

OK, we're going to switch hats now. We've built a dashboard and published it; now we're going to talk about business users, the people who need to consume those insights, and what we can do for them. Two really exciting things for business users were announced on Tuesday and are currently available in preview. The first is the ability to create executive summaries from dashboards.

As a business user, I can create an executive summary that gives me a natural language summarization of the key insights of a dashboard. It surfaces things like top and bottom movers and anomalies, many of the things available in insights, all in a natural language pane.

The other piece that's very exciting with Amazon Q in QuickSight is the ability for a business user to ask a question, or select from a list of suggested questions, and get a multi-visual response. In a previous version of Q you got one single visual; now you get multiple visuals for a more robust answer, so you can truly understand what's going on in your data. And if you ask a vague question as a business user, like "top five countries," you get the "Did you mean?" pane as well, with suggestions that help guide you to the answer.

We also provide insight summaries within that pane to give you context, so you can understand the data more confidently. Those were the two key components announced for business users on Tuesday. Another key and very exciting addition to the generative BI capabilities for business users is storytelling: the ability to create data stories as a business user.

The new data stories functionality gives business users an interactive, visual way to create and share information, and they can do it just by giving a natural language prompt. With data stories, they can create that typical weekly or monthly flow. This is a use case we hear about all the time: we have a monthly meeting, and to prepare for it I have to search through multiple dashboards and multiple reports, typically take a screenshot or screen grab of an image, push those out, and then they get imported into some kind of slide deck, document, or email.

Data stories bring all of that together in a single location to create the report. The AI searches for the insights and the findings, creates the natural language context, and then lets you share the result with others. We're really excited about data stories and the amount of time they can save our business users. They also help with security and governance, because once you start taking snapshots of images, you lose all the security and governance around the data.

So we're going to take a look at building a data story and how this works. Again, this is available in preview today. Back with our car manufacturer: now we're the sales manager, and the dashboard has been delivered to us. How do I want to share my insights with other users to get them to take action? I'm going to create a story.

With data stories, from our published dashboard the user can choose "create story." You can create a scrolling page or a slideshow, and then you provide the prompt for what kind of story you want: I want a story about body style, region, and country total sales, highlighting key trends. You can choose which visuals should be considered for the story; that doesn't mean they will all be brought in automatically, just which ones you want considered, and those can come from multiple dashboards.

QuickSight will go and build the story. It will have multiple sections and a title, and you can format the title; here I'm just adding my name. You see a number of different sections, but you still have control as the business user. Here I'm seeing my total quarterly sales trend; if I want to add the monthly sales trend visual that wasn't already incorporated, I can do that as well.

So as a business user, I can add different visuals, add a title, and do some formatting. As I scroll down, you can see I've got sales by region and body style; the Sankey chart came in with some key metrics. My top five countries by sales came in as a paragraph; maybe I want to change that to bullets. These are the things you can modify as a business user. You have control, but the insights are being generated for you.

I also have the ability to modify visuals. I could change the chart type; in this case, I'm going to remove the legend to make it look a little cleaner, since I already have the labels on the donut chart itself. I added the monthly sales trend earlier in the report, so I'm going to delete that section. I'm also going to add another visual in the body style section, where it talks about SUV demand. I can do multiple things with the additional sections.

I'm going to delete the "utilizing insights across departments" section because it's not exactly relevant, and then I want to change my conclusion to say "next steps," or a call to action, because I want the people who view this story to know I'm asking them to do something. You have full control over modifying the text, and then I add that call to action at the bottom of my story.

We do some formatting here; I say I want them to bring some ideas to our next meeting, and our story is created. It looks pretty good, right? But we also have control over the different styles. We've got a few different styles available; I like the dark style myself, it really pops off the screen. So you can apply those style sheets to your story before you share it: create your story and then share it with other users.

We went through some of this quickly, but I wanted to show you what I built for this particular presentation because it is truly a productivity gain. This is a point where generative BI can help our business users by giving them tools that make their lives easier while keeping governance and security in place.

With that, that wraps up my part on delivering the generative BI capabilities to business users, something we're really excited about. Again, we have the authoring capabilities as well as the business user capabilities in preview today. And that rounds out our presentation, walking you through how we're adding generative AI capabilities across the analytics suite.

So, on behalf of Ken and myself, we thank you for your time.
