Salesforce Data Cloud Consultant Certification

Solution overview - 18%

Introducing Data Cloud

Connect

Import all relevant customer data at scale, from any app, device, or real-time stream with out-of-the-box connectors.

Harmonize

Organize all the connected customer data into a singular, standardized data model based on the Salesforce Customer 360 Platform.

Unify

With data in a single customer graph, anticipate customer needs and preferences with unified profiles that adapt to their activity in real time.

Analyze and Predict

Gain insights on unified customer data from powerful analytical tools like Tableau.

Activate

Unified customer profiles empower teams to create intelligent, automated experiences across the Customer 360. This unified data gives teams everything they need to know about customers as they interact with the business.

Data Cloud: Capabilities

Lifecycle


Connect

The first step in building a unified data model is to connect your data sources.

Connectors

Data Cloud connectors make integrating your data from common sources fast and easy, without relying on a data integration team. The bundles below allow you to connect to each source system and retrieve the most commonly used data with clicks, not code:

  • Marketing Cloud Email Studio
  • MobileConnect
  • MobilePush
  • Marketing Cloud Data Extensions
  • Salesforce CRM
  • Marketing Cloud Personalization
  • Common Cloud Storage Providers
  • Commerce Cloud
APIs

In addition to the standard connectors, Data Cloud gives users the ability to connect data from any source with its Ingestion APIs.

The Streaming API can be used to send real-time event information to Data Cloud, such as engagement information from your website or mobile application. It uses a fire-and-forget pattern to synchronize micro-batches of updates between the source system and Data Cloud in near-real time. Data is processed asynchronously approximately every 15 minutes.
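As a sketch, a streaming micro-batch is just a small JSON body of event records. The `{"data": [...]}` envelope and the field names below are assumptions based on common Ingestion API payload shapes, so verify them against your org's schema definition:

```python
import json

def build_streaming_payload(records):
    """Wrap a micro-batch of event records in the envelope assumed here
    ({"data": [...]}); the target object is addressed via the endpoint
    path rather than the body."""
    return json.dumps({"data": list(records)})

# Hypothetical engagement events captured from a website
payload = build_streaming_payload([
    {"event_id": "e-001", "email": "sam@example.com", "event_type": "page_view"},
])
```

The source system fires this payload and moves on; Data Cloud processes the accumulated micro-batches asynchronously.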

For systems without a pre-built connector, the Bulk API provides a way for external platforms to prepare data in large batches of up to 150 MB at a time and send it to Data Cloud for ingestion.

MuleSoft

The MuleSoft Anypoint platform contains dozens of pre-configured connectors for common platforms to easily and quickly transfer data from systems outside of Salesforce into Data Cloud.

Consume or activate data to any cloud and any application.

Build a trust-based, first-party data asset. Provide transparency and security for data gathered from individuals who provide consent for its use and receive value in exchange.

Harmonize

DATA MODELS
The Customer 360 Data Model

The Customer 360 Data Model is the standard Data Cloud data model that helps with the interoperability of data across applications.

Data Model Object (DMO)

Once data has been ingested into Data Cloud, it can be mapped to objects based on the Customer 360 data model. When a customer maps their data in this way, it creates a Data Model Object (DMO). The result is a normalized entity within the Customer 360 data model, like an Individual or a Sales Order.

Unify

One of the critical features of Data Cloud is Identity Resolution. Identity Resolution is the process of identifying and linking all the different data records that refer to the same real-world entity or object, such as a person. It’s an important task in data management and analysis, as it helps to ensure data quality, reduce redundancy, and improve integration.

Identity Resolution

At the heart of the system is the ability to resolve multiple identities into a single, unified profile. This profile is persistent, which means that it contains all the relevant information and attributes associated with it, and it’s continually updated over time.

Identity Matching

A key component of Identity Resolution is the concept of matching. When it comes to methods used to identify an individual, Data Cloud uses both deterministic (exact) matching, such as a matching email address, and probabilistic (fuzzy) matching, which uses machine learning and statistics to identify similar records with a high probability of being a match. An example might be a person who goes by both Matthew and Matt in different contexts.
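To make the distinction concrete, here is a minimal sketch. Data Cloud's actual probabilistic algorithm is proprietary; `difflib` merely stands in for a fuzzy scorer:

```python
from difflib import SequenceMatcher

def exact_match(a, b):
    """Deterministic match: normalized values must be identical."""
    return a.strip().lower() == b.strip().lower()

def fuzzy_score(a, b):
    """Probabilistic match: a similarity ratio between 0 and 1,
    standing in for Data Cloud's statistical matching."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# "Matthew" vs "Matt" fails exact matching but scores well fuzzily.
```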

Identity Reconciliation

Identity Reconciliation is the other key component of Identity Resolution. When two or more identities are matched, Identity Reconciliation dictates the rules by which duplicate attributes are chosen to represent that Unified Individual. Data Cloud allows you to customize these rules to make sure you’re referencing the right attributes about each individual.
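A reconciliation rule can be sketched as a function that picks one winning value from the matched records. The rule names below ("last updated", "most frequent") are illustrative analogues of Data Cloud's configurable rules, not their exact names:

```python
def reconcile(records, rule="last_updated"):
    """Choose the winning value for one attribute across matched records."""
    if rule == "last_updated":
        # The most recently modified source record wins
        return max(records, key=lambda r: r["modified"])["value"]
    if rule == "most_frequent":
        # The value appearing in the most source records wins
        values = [r["value"] for r in records]
        return max(set(values), key=values.count)
    raise ValueError(f"unknown rule: {rule}")

matched = [
    {"value": "Matt Smith",    "modified": "2023-01-01"},
    {"value": "Matthew Smith", "modified": "2023-06-15"},
    {"value": "Matt Smith",    "modified": "2023-03-10"},
]
```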

Unified Customer Profiles

A Unified Profile is the product of Identity Resolution. It’s a complete and consistent view of a customer or entity that combines data from multiple sources.

Declarative Tools

With Data Cloud, powerful data manipulation and aggregation tools are available to use in an easy, click-based UI.

Cross-Device Identity Management

Your customers have a complex digital footprint that comes from multiple types of identities created from their interaction with various advertisements, social media platforms, and devices. Data Cloud keeps track of all these identities, and when it has enough information to match these identities together, it adds that information to that individual’s Unified Profile.

Householding

Data Cloud has the capability to group together individuals that likely belong to members of the same household, family, or other groups like an organization. This is done by analyzing data—such as addresses, phone numbers, and other identifying information—to identify patterns and relationships that suggest multiple identities belong to the same group.
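A rough sketch of the idea: group profiles by a normalized address key. Real householding weighs many signals (names, phone numbers, relationships), so this is only an illustration:

```python
from collections import defaultdict

def group_households(individuals):
    """Group individuals that share a normalized postal address."""
    households = defaultdict(list)
    for person in individuals:
        # Collapse case and whitespace so trivially different strings match
        key = " ".join(person["address"].lower().split())
        households[key].append(person["name"])
    return dict(households)

profiles = [
    {"name": "Sam",  "address": "12 Oak St,  Springfield"},
    {"name": "Alex", "address": "12 oak st, springfield"},
    {"name": "Kim",  "address": "9 Elm Ave, Shelbyville"},
]
```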

Analyze and Predict

The unified data model provided by Data Cloud builds a powerful foundation to enable businesses to use analytical tools to gain insights and make data-driven decisions.

How are insights gained?

Calculated insights

Data Cloud is capable of performing complex aggregations on unified data through click-based tools or using SQL.
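The kind of rollup a Calculated Insight expresses (for example, a SUM(amount) GROUP BY over orders) can be sketched in plain Python; the field names here are hypothetical:

```python
def lifetime_value(orders):
    """Aggregate order amounts per individual, analogous to a
    SUM(amount) GROUP BY rollup in a SQL Calculated Insight."""
    totals = {}
    for order in orders:
        key = order["individual_id"]
        totals[key] = totals.get(key, 0) + order["amount"]
    return totals

orders = [
    {"individual_id": "1234", "amount": 40.0},
    {"individual_id": "1234", "amount": 60.0},
    {"individual_id": "5678", "amount": 25.0},
]
```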

Streaming insights

Aggregations are not only for bulk data. Real-time data streams can be calculated and aggregated to launch powerful actions to personalize every moment.

Tableau integration

A native integration with Tableau enriches business intelligence (BI), driving deeper insights using the unified profile. In addition, the Java Database Connectivity (JDBC) driver allows for a connection to the Data Cloud ANSI SQL API. This connection lets developers use their favorite BI tools to access and retrieve data.

Datorama integration

Reference the Data Cloud unified data model in Datorama to accelerate time to value in building analytical and predictive models for marketing and advertising campaigns.

Data provisioning

Analysts have made it clear that the single most differentiated feature of such a platform is its ability to provision data embedded with intelligence, create harmonized data packages provisioned for specific endpoints, and design for specific business personas. This is the core Data Cloud development approach.

Act

What’s the impact of segmentation and activation?

Ability to query data

At the heart of Data Cloud is its segmentation engine, which gives business users the ability to query all of the data in the system, create granular segments of customers, and understand their composition.

Attributes from other systems

Automatically use attributes from sales, service, commerce, loyalty, enterprise resource planning (ERP), and modeled data in an easy-to-use, declarative interface, and get immediate population results.

Easy activation

Activation is as easy as clicking a button to send segment data along for activation in messaging, advertising, personalization, and analytics systems.

Discover more about segmenting and activating data.

Comprehensive data

Access data from marketing, sales, service, commerce, data warehouses and lakes, and any source available on demand.

Smarter data

Use integrated, Einstein-calculated attributes to add modeled data, like propensity and Customer Lifetime Value (LTV) scores.

Immediate results

Run queries and use drag-and-drop segmentation tools to test and build audiences with incredible speed at an enterprise scale.

Activate anywhere

Activate data for advertising, marketing, and personalization through every Marketing Cloud product, and send it to external partners.

Data Cloud Use Cases

Scenarios

Financial Services
  • Use Segments to identify major life events, such as graduation, first job, marriage, childbirth, divorce, retirement, or inheritance, to grow deposits and generate revenue.
  • Use Streaming Insights to detect possible fraudulent transactions and launch a real-time journey to notify customers to review any suspicious behavior.
Healthcare And Life Sciences
  • Integrate Data Cloud with multiple health telematics systems to calculate a Unified Health Score, or identify critical points for intervention in care.
  • Onboard healthcare providers with greater efficiency and reduced risk of churn with aggregated insights across multiple channels.
Retail And Consumer Goods
  • Unify known consumer profiles to identify key audiences for personalized ads and new audiences through look-alikes. Select audiences by top purchases, top-tier loyalty, and highly engaged customers.
  • Use real-time engagement data to identify moments that matter in a customer journey, from an abandoned cart to delivery notifications.
Media And Communications
  • Identify an optimal purchasing window for new devices and services by combining historical transactions and real-time behavioral data.
  • Turn customer service into an upsell channel by having a shared, Customer 360 view across subscription management and sales.

Data Subject Rights

A customer’s data subject rights, such as those granted to them by the European Union’s General Data Protection Regulation (GDPR), relate to the storage and usage of their personal data. Data Cloud offers tools to support your consumer’s data subject rights in alignment with various data protection and privacy regulations.

You can submit data subject rights requests for individual profiles in Data Cloud. If you’re using the Data Cloud Identity Resolution feature, submit requests for source individual profiles.

Data Subject Rights Requests

Submit Data Subject Rights requests using Individual ID as the identifying parameter. Requests must be submitted and processed separately in all connected Salesforce clouds, including Commerce Cloud.

EXAMPLE Northern Trail Outfitters (NTO) has a Unified Individual profile for Samantha Smith that consists of two Source Individual profiles. These Source Individual profiles are identified as Individual ID 1234 and 5678. Samantha requests that Northern Trail Outfitters delete her personal data. When processing Samantha’s request, NTO submits a data deletion request to the Consent API using Individual ID 1234 and 5678 in the relevant API parameters.
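NTO's request above can be sketched as composing a Consent API call. The path segments and the `ids` parameter name here are assumptions, so verify them against the Consent API reference for your API version:

```python
from urllib.parse import urlencode

def deletion_request_url(instance_url, individual_ids):
    """Compose a hypothetical Consent API deletion request URL that
    passes each source Individual ID as an `ids` parameter."""
    query = urlencode([("ids", i) for i in individual_ids])
    return f"{instance_url}/consent/action/delete?{query}"

url = deletion_request_url(
    "https://nto.example.com/services/data/v58.0",  # hypothetical instance URL
    ["1234", "5678"],  # Samantha's two Source Individual profiles
)
```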

Obtain an Individual ID

You can obtain Individual IDs through three different methods: Segmentation, Source System, and Query API.

  • Via Segmentation
    You can retrieve an Individual ID by creating a segment using any available identifying information as segment filter criteria. This information includes email address, phone number, or first and last name. You can activate your segment to Amazon S3. The published activation file includes the Individual IDs for the segment population.

  • Via Source System
When data is mapped to the Individual entity, you must map your imported customer identifiers to the Individual ID field. This field reflects the value from your Data Stream source. You can retrieve Individual IDs by consulting the source systems for your Data Streams.

  • Via Query API
    You can retrieve an Individual ID by submitting a request to the Query API. Submit requests to the Query API using email address, phone number, or first and last name.

Requesting Data Deletion or Right to Be Forgotten

You can submit Data Deletion requests of individual data profiles in Data Cloud. All requests must be submitted using the Consent API.

A Data Deletion request deletes the specified individual record from the Individual DMO and the related DMOs.

Data Deletion requests are reprocessed at 30, 60, and 90 days to ensure a full deletion. You must submit any data deletion requests in all connected systems and Salesforce clouds.

Restriction of Processing requests in Data Cloud

You can submit Restriction of Processing requests for Individual and Unified Individual profiles in Data Cloud. All requests must be submitted using the Consent API.

Restriction of Processing requests restrict all data processing for the specified Individual and Unified Individual profiles within 24 hours.

Carefully consider consumer rights and expectations when building segments based on objects other than Individual and Unified Individual.

Submit any Restriction of Processing requests in all connected systems and Salesforce clouds.

Requesting Data Access and Export

You can submit Data Access and Export requests for Individual profiles in Data Cloud. Submit all requests using the Consent API.

Data Access and Export requests trigger the export of all data stored within Data Cloud for the specified Individual profiles. This export is published as a CSV file to your defined Amazon S3 bucket within 15 days.

Checking Status of Data Subject Rights Requests

You can check the status of your Data Subject Rights requests for Individual profiles in Data Cloud using the Consent API.

Impact of Data Modeling on Data Subject Rights Requests

Map all personal data stored in Data Cloud to the Individual entity to ensure that it’s processed as part of any Data Subject Rights request.

EXAMPLE Northern Trail Outfitters ingests sales order data from Sales Cloud and stores it in the Sales Order entity in Data Cloud. This sales order data includes a field for customer email address, which is considered personal data. Northern Trail Outfitters creates a relationship in the object data model between a custom Customer Id attribute on the Sales Order entity and the Individual Id field.

Consent Management

Many data protection and privacy regulations require you and your company to honor a consumer’s requests for how you contact them and how you use their personal data. Data Cloud provides multiple methods for you to ingest and store your consumer’s consent preferences.

Ingesting Consent Preferences

Ingest consent preferences from wherever you’re storing them using connectors. Common storage locations include data extensions (Marketing Cloud Connector), Sales or Service Cloud standard or custom objects (Salesforce CRM Connector), or external sources (Cloud Storage Connector).

TIP We recommend mapping consent preferences to objects from the Privacy Data Model, but they can be mapped to custom objects as well.

EXAMPLE Northern Trail Outfitters ingests a Marketing Cloud Data Extension that includes engagement channel and communication subscription consent data. This data allows Northern Trail Outfitters to determine which engagement channels and communication subscription topics they can use to contact their customers. When creating the Data Stream for this Data Extension, Northern Trail Outfitters maps the consent preference fields to Data Model Objects (DMOs) from the Privacy Data Model.

Configuring Data Model Preferences

When ingesting a data stream that includes consumer consent preference fields, map the fields to custom objects or attributes within the Data Cloud data model.

If you’re storing consent on a standard or custom data model object (DMO), define the appropriate relationships between that object and the Individual DMO.

If you’re storing consent on multiple DMOs (standard or custom), define the appropriate relationships between those objects.

Using Consent Preferences in Segmentation

If you’re using custom DMOs or attributes to store consumer consent preferences in Data Cloud, use them as filter criteria when creating a segment.

For example, Northern Trail Outfitters created a segment of individuals who opted into email communication and haven’t opted out of promotions. They use the Consent Status attributes as filter criteria to ensure their segment honors individuals’ consent preferences.

Checking Consent Preferences at Activation

Activating an audience publishes a segment to a Salesforce cloud or an external system.

We suggest checking for consent before using this data in any marketing platform.

Data Cloud Setup & Administration - 12%

Provisioning

Where Does Data Cloud Live?
  1. Remember Data Cloud data is not stored as sObjects, but in the Data Lake outside of Core CRM.
  2. Sharing Rules and other data restrictions in Core CRM do not apply to data stored in Data Cloud.
  3. Productized API integrations with Marketing Cloud, Commerce Cloud, and MuleSoft, plus out-of-the-box (OOTB) Data Cloud org LWCs, help navigate data stored in Data Cloud.
Using an Existing Data Org

Using a New (Separate) Home Org


Setup For The First Time
  1. Set Up a Data Cloud Admin User
  2. Create Data Cloud profiles before creating Data Cloud users.
  3. Add or update Salesforce users to create Data Cloud users.
  4. Update permission set assignments to give users access to Data Cloud.
Permission Sets
  • Customer Data Platform Admin: Responsible for the setup of the application, user provisioning, and assigning permission sets within the system. This role has access to Salesforce Sales Cloud and Service Cloud, in addition to other integrated systems within the core cloud platform. The admin executes day-to-day configuration, support, maintenance, and improvement, and performs regular internal system audits.
  • Customer Data Platform Data Aware Specialist: Responsible for creating data streams, mapping data to the data model, creating identity resolution rulesets for unified profiles, and for creating calculated insights.
  • Customer Data Platform Marketing Manager: Responsible for the overall segmentation strategy, including creating activation targets and activations. This permission set also includes the ‘Customer Data Platform Marketing Specialist’ permissions.
  • Customer Data Platform Marketing Specialist: Responsible for creating segments in Customer Data Platform.

We recommend assigning users to these standard permission sets, since they can change with each release as new features become available in Data Cloud. Cloning existing permission sets to create custom ones could leave users without access to features or functionality when the standard permission sets change.

Connector Setup

Data Cloud has a set of pre-built connectors to the following:

  • Salesforce Clouds, such as CRM, Marketing Cloud, B2C Commerce, and Marketing Cloud Personalization.
  • External sources, such as external file storage (Google Cloud Storage, Amazon S3).
  • API and mobile connectors, such as Ingestion API, Web, and Mobile SDK.

Each connector varies in how data can be ingested: batch, near real-time, or real-time. It’s important for an admin to be aware of the different ingestion patterns for these connectors.

  • Batch - The CRM Connector and Marketing Cloud ingest and update data hourly, which follows a batch pattern.
  • Near Real-Time - The Ingestion API processes small micro-batches of records every 15 minutes, so it’s considered near real-time.
  • Real-Time - The Web & Mobile Connector can process engagement data every 2 minutes, which is a real-time ingestion pattern.

The Salesforce CRM connector enables access to the Salesforce CRM data, including but not limited to Sales Cloud, Service Cloud, B2B Commerce, and Loyalty Management.

  • Home org: This is the org where it’s installed. If the customer is using this org for Sales Cloud or Service Cloud or Loyalty Management, they may use the connector to ingest CRM data from within the Home org.
  • External orgs: These CRM orgs are external to the org where it’s installed. Customers may connect to any production external orgs, including other orgs where it may be installed.
  • Sandbox orgs: These are sandbox CRM orgs that are external to the org where it’s installed. Customers may connect to any sandbox external orgs.
CRM Support

Salesforce CRM supports 1:1, 1:M, and M:1 connections across these three org types.

Amazon S3 Connector
  • The Amazon S3 connector lets you ingest data from S3 buckets as well as activate data to S3.
  • The process for setting up S3 connections differs from other connectors. Rather than having an administrator configure the connector within Data Cloud’s setup, Amazon S3 connections for data ingestion are configured individually at the data stream level.
  • Connections can be made by any user with access to create a Data Source, such as a Data Aware Specialist. Additionally, it means that connection information must be provided each time a new Data Stream is created.

TIP:

  • Unmanaged packages support packaging AWS S3 data streams with relationships to both standard and custom data models.
  • Managed packages support packaging AWS S3 data streams with relationships to custom data models only.
Google Cloud Storage Connector
  • Once an admin configures the connection, it can then be used by other users (data-aware specialists or marketing managers) to ingest or activate the data (without needing to know the GCS credentials).
  • Ingest data using Google buckets: Land and ingest flat file data using Google Cloud storage.
  • Define Google buckets in setup: Register buckets in setup to simplify stream definition and management.
    • Credentials for all related data streams can be easily managed from a single location.
  • Google Cloud Storage: Limit and refresh schedules.
  • Five GCS connections per org are supported.
  • Data and files from GCS buckets are kept in sync hourly with Data Cloud infrastructure.

Administration

External Activation Platform
  • Use your single source of truth audience segments anywhere to drive targeted advertising, insights, and more.
  • Deliver audiences for web, mobile, and CTV-targeted campaigns using hashed PII, mobile devices, and OTT.
  • Any partner is enabled to create Data Cloud activation connectors via AppExchange or combine capabilities to create innovative solutions.
Unmanaged Package
  • Commonly used for one-time migration of metadata.
  • All components and attributes are editable.
  • Not upgradeable nor supported.
  • The developer has no control over the components once installed.
Managed Package
  • Typically used by ISVs on AppExchange to sell and distribute their apps.
  • Protects intellectual property of developer/ISV.
  • Is upgradeable and supports versioning.
  • To support upgrades, certain destructive changes are not allowed.
  • Contains a distinguishing namespace.
Data Kits

Data Kits offer a more efficient way to package Data Cloud. Create new Data Kits to bundle CRM data streams and data models with flexibility and ease. This experience makes deploying multiple CRM streams as easy as Data Cloud out-of-the-box bundles.

  • Data Streams
  • Data Models
  • Calculated Insights
Metadata API

The Metadata API is the standard Salesforce API layer for creating, modifying, and managing metadata objects. Parts of Data Cloud’s configuration objects are available in the Metadata API. More details are found in the Metadata API Developer Guide.

  • AWS Data Streams
  • Ingestion API Data Stream
  • Mobile and Web Data Streams
  • Data Lakes
  • Data Models
Data Cloud Analytics

Data Cloud allows you to use various tools and products to analyze your data.

  • Tableau
  • CRM Analytics
  • Marketing Intelligence

TIP: It’s important to note that the data in Data Cloud itself lives in the data lake, so at this point it’s not possible to use standard Salesforce Reports & Dashboards in CRM on top of it.

Administrator Reports

The following objects are currently supported in Lightning Report Builder.

  • Data Stream
  • Segment
  • Activation Target
  • Identity Resolution
Lightning Chart

For a visual overview of your data, add a report chart to give users a visual way to understand the data in your report. Lightning charts are also added to various parts of the customizable Lightning UI.

TIP: To create a report on a Data Cloud object, you need to configure a custom report type. Once the custom report type is created, it becomes available in the Lightning Report Builder.

Workflow Orchestration
  • Workflow Orchestration enables Data Cloud Admins to define more granular, connected workflows with more flexible execution schedules.
  • Data Cloud actions are used in Salesforce Flow builder to create automated workflows. This leads to near-real-time execution in a sequence based on the customer’s need.
  • Available Flow Actions
    • Data ingestion for CRM data streams.
    • Data ingestion for S3 data streams.
    • Publish Calculated Insight.
    • Trigger Identity Resolution Job.
    • Publish Segments, materialize segments, and activate.
  • Error Notification. The following objects are currently supported in Flow Builder:
    • Identity Resolution
    • Calculated Insights
    • Data Streams
    • Segments
    • Activations
Unified Individual Record Pages

With Lightning App Builder, admins customize and organize their Data Cloud Unified Individual Record Pages to quickly surface insights most relevant to their teams.

  • Customize and organize the Record Detail page for your Data Cloud Unified Individuals. This includes ordering and hiding specific fields, and UI components.
  • The App Builder page for the Data Cloud record supports customization with other Salesforce Out-of-Box Lightning Components or custom components created by customers.
  • The access to Profile Explorer and Unified Individual is managed by a Permission Set in the Salesforce Platform Setup.
Profile Explorer Record

The Profile Explorer record page within Data Cloud can now be customized with App Builder, so fields on this page can be reordered, added, or removed.

  • Data Cloud Detail Panel
  • Data Cloud Highlights Panel
  • Data Cloud Profile Related Records
  • Data Cloud Profile Engagements
  • Data Cloud Profile Insights
Contact Record

The Contact record page within Data Cloud can now be customized with the following components.

  • Data Cloud Profile Engagements
  • Data Cloud Profile Insights
  • Data Cloud Profile Related Records
Troubleshooting

Use Data Explorer within Data Cloud to view and validate the data in your Data Model Objects (DMOs), Data Lake Objects (DLOs), and Calculated Insights (CIs). Use it to ensure your data, formulas, and other transformations are accurate.

TIP: Data Explorer only displays a maximum of 100 records. Use filters to control the records being returned and displayed in Data Explorer.

When publishing your segment, a segment membership DMO is automatically created to store information about the members of the segment.

Troubleshoot Segment Errors


Troubleshoot Activation Errors


Sharing Rules

Sharing Rules are used to extend sharing access to users in public groups, roles, or territories for your Data Cloud Objects.

  • Data Streams
  • Calculated Insights
  • Segments
  • Activations
  • Activation Targets

Data Ingestion and Modeling - 20%

Data Ingestion

Normalized Data

Normalized data is divided into multiple tables, with established relationships to reduce redundancy and inconsistency.

Denormalized Data

Denormalized data is combined into a single table to make data retrieval faster. The rows contain relational data as column attributes. This format is commonly known as a spreadsheet view.

Configure the Data Stream

This diagram outlines the steps you typically take when configuring the data stream from a data source to ingest data.

Select the Data Source

Choose from connected sources identified during the setup process, MuleSoft, and cloud-based options like Amazon S3 or Google Cloud Platform.

Select the Data Source Object

A data stream operates at a single object or file level.

  • In Marketing Cloud, this might be a data extension.
  • In Salesforce CRM, it might be a contact or case object.
  • In B2C Commerce, it might be a Sales Order Customer Entity or a Sales Order Entity.
  • With Amazon S3 or Google Cloud Platform options, you need to specify details of the file that will be ingested.
  • If the file is located in the root directory of the S3 bucket, the Directory attribute can be left blank; otherwise, you’ll need to specify the directory.
  • The directory value should start with the directory name (without a leading slash) and end with a slash (/).
  • The file can also be compressed with Zip and GZ compression standards, but the archive can only contain a single file within it.
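The directory rules above can be captured in a small validation sketch:

```python
def validate_directory(directory):
    """Check the Directory attribute rules: leave it blank for files in
    the bucket root; otherwise it must not start with a slash and must
    end with one."""
    if directory == "":
        return True  # file lives in the bucket root
    return not directory.startswith("/") and directory.endswith("/")
```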
Define the Data Stream Properties

This section focuses on two key fields when defining the data stream properties:

  • Source Name: This identifies the source system.
  • Category: This determines how data is used.

The data source category plays an important part in how the data is used within the Data Cloud data model. Let’s review each of the categories in a bit more detail.

Profile Data

Use the Profile Category for data sources that provide information about individuals with their identifiers, demographic and profile attributes, and contact points (such as email, phone, or postal address).

Engagement Data Category

Use the Engagement Category for data sources that provide time-series data points. Examples include customer transactions, engagement activities, like opening and clicking through emails, and web browsing histories.

In practice, engagement data provides input for calculated insights and is used in segmentation criteria to assess an individual’s behaviors, affinities, and propensity over a period of time.

Once you choose the Engagement Data category, you’re required to specify which field from the data set correlates to the Event Date.

Be careful! This field selection can’t be edited after the data stream is set up. If you choose the wrong attribute, you have to delete and reconfigure the data stream to correct the selection.

Immutable Value

  • Values that cannot be changed after the record is created.
  • Immutable Values are essential for maintaining data relevancy and validity.
Other Data Category

Lastly, use the Other Data category for all other data sources, including engagement data with mutable date fields.

Confirm the Data Source Object

Once the data source is selected, the Data Cloud platform evaluates the data set and presents a list of fields with their suggested data types for the Data Source Object (DSO).

Data Types

Data Cloud supports the following data types: Text, Number, Date, and DateTime. While the platform is intelligent and recognizes most of the data types appropriately, pay special attention to the date formats.

Always carefully review the suggested data types for all of the attributes, because these can’t be changed after the creation of the data stream.

Primary Key

This value uniquely identifies a given record within the data set and establishes whether a new record from the data source should be added to the DSO or if an existing one should be updated.

Record Modified Field
  • This attribute acts as a reference point when the system decides whether to update a record, keeping the latest version of the record.
  • It’s also useful when data might be received out of order, helping prevent overwriting of the information with the older version.
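
The role of the Record Modified field can be sketched in a few lines (a minimal illustration only; the dictionary store and field names such as `Modified` are hypothetical, not Data Cloud internals):

```python
from datetime import datetime

def upsert(store: dict, record: dict, key_field: str, modified_field: str) -> bool:
    """Apply an incoming record only if it is at least as new as the stored
    version. Returns True if the store was changed. `store` maps primary-key
    values to records; field names here are illustrative."""
    key = record[key_field]
    existing = store.get(key)
    if existing is None or record[modified_field] >= existing[modified_field]:
        store[key] = record
        return True
    return False  # stale update arrived out of order; keep the newer version

store = {}
upsert(store, {"Id": "1", "Email": "a@x.com", "Modified": datetime(2023, 5, 2)}, "Id", "Modified")
# An older update arriving late is discarded, so the newer email survives:
upsert(store, {"Id": "1", "Email": "old@x.com", "Modified": datetime(2023, 5, 1)}, "Id", "Modified")
```
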
Organization Unit Identifier

If your data set includes an attribute that provides a reference to an organization unit, such as Marketing Cloud business unit ID (MID), you can specify that attribute in the Organization Unit Identifier configuration field of the data stream.

Apply the Necessary Row-Level Transformations

  • What happens when a data source doesn’t include a field with a unique value that can be used as a primary key?
  • What if the uniqueness of the record needs to be determined via the Composite Key?

Composite Key
The value that is produced by combining values from more
than one field together.

In these situations, Data Cloud provides the ability to create formula fields. Use these to improve and enrich source data.

Formula fields can include a combination of hard-coded literal values, or they can derive and calculate values using formula functions.

It’s important to know that formulas work at the row level. This means
that for any given record processed, the formula context only enables
access to the fields of that single record.
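
For example, a row-level composite key that concatenates a literal prefix with fields from a single record can be sketched like this (the function and field names are hypothetical, not Data Cloud formula syntax):

```python
def concat_key(record: dict, fields: list, prefix: str = "") -> str:
    """Row-level formula sketch: build a composite key by concatenating a
    literal prefix with one or more field values from a single record."""
    return prefix + "-".join(str(record[f]) for f in fields)

row = {"OrderId": "O-1001", "LineNumber": 3}
key = concat_key(row, ["OrderId", "LineNumber"], prefix="POS-")
# key == "POS-O-1001-3"
```
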

Use Cases
Primary Keys and Missing Attributes

Create attributes needed for the ingestion or mapping of source data.

  • If source data is missing a primary key, or a composite key is needed using a concatenation of literal values with one or more attributes from the data source
  • Creating an event date time for an engagement data set that doesn’t have a set date and doesn’t need to be updated
Normalization

Simplify segmentation and improve usability by bucketing or grouping source data values.

  • Converting the loyalty points balance into tiers, sport interests into categories, city and postal code into a region, and range of values into Boolean sets
Standardization

Ensure consistent, clean data values and formatting for segmentation and activation.

  • If the data source contains fields with mixed case values, or includes special characters that need to be removed
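
These normalization and standardization patterns can be sketched as plain row-level functions (illustrative thresholds and cleanup rules, not Data Cloud formula syntax):

```python
def loyalty_tier(points: int) -> str:
    """Normalization sketch: bucket a raw points balance into a tier value.
    The tier thresholds are hypothetical."""
    if points >= 10000:
        return "Gold"
    if points >= 1000:
        return "Silver"
    return "Bronze"

def standardize(value: str) -> str:
    """Standardization sketch: strip special characters, collapse
    whitespace, and normalize mixed-case values to title case."""
    cleaned = "".join(ch for ch in value if ch.isalnum() or ch.isspace())
    return " ".join(cleaned.split()).title()

tier = loyalty_tier(2500)          # "Silver"
city = standardize("  new YORK!! ")  # "New York"
```
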

It’s important to know that the UUID() function can’t be used to
generate primary key values in a majority of use cases.

Configure Updates to the Data Source Object

The last step in the configuration process is to define how the data is written and refreshed in the Data Source Object (DSO).

Refresh Mode
  • A data stream can be scheduled to refresh hourly, daily, weekly, or monthly.
  • It can also be configured with a None option, providing the opportunity to ingest the data manually.
Starter Data Bundles

The Starter Data Bundles provide access to sales, service, and loyalty data, enabling highly personalized messaging experiences for specific customer segments. The bundle deploys data streams and configures mapping to the Data Cloud data model.

Sales Cloud Bundle

The Sales Cloud bundle includes the following objects:

  • Account
  • Contact
  • Lead
Service Cloud Bundle

The Service Cloud bundle includes the following objects:

  • Account
  • Case
  • Contact
Loyalty Cloud Bundle

TIP
Keep in mind that access to the objects and fields, including some standard objects like the Case object, must be explicitly granted via the Salesforce Data Cloud Salesforce Connector Integration permission set. Failing to do this results in an Insufficient Permissions error message.

Direct Object Ingestion

After selecting the All Objects option, the dialog displays the list of all objects visible to the connector.

  • Find the specific objects you need using either manual scroll-and-select or the quick search functionality.
  • Once located, only one object can be selected and configured for deployment at a time.
  • Next, the Schema Review dialog box allows you to specify the Category for the object, although it automatically defaults to Profile. The dialog also supports:
  1. The Primary Key, which is pre-populated and read-only.
  2. Selection and de-selection of the fields for each respective object.
  3. Updates to the Field Label and Field API Name, while preventing any modifications to their data types.
  4. Creation of formula fields.
  • At the deployment step, the dialog allows for a data stream name update, defaulting its value to Object Name_Salesforce Org Id.

The incremental (ongoing) refresh setting automatically defaults to Hourly since this is the only refresh frequency available at this stage. However, the data stream is fully refreshed weekly.

Once the data stream is deployed, be sure to map it to the Data Cloud data model, since this method does not provide an automated mapping.

Data Kits for CRM Data Streams

Data kits provide a new way to bundle CRM data streams and data models with flexibility and ease. As a result, data kits make deploying multiple CRM data streams as easy as when using starter data bundles!

Data Modeling

Explore the details of the data modeling process in Data Cloud, including mapping ingested data into the standard model, extending the model to satisfy segmentation and activation requirements, and defining the relationships between data model objects.

Glossary
DSO

Data Source Object. The object that underpins a data stream.

DMO

Data Model Object. An object in the Data Cloud data model that consolidates data of the same nature from numerous data sources through data lake objects.

DLO

Data Lake Object. The target destination for records from data streams.

Customer 360 Data Model

Based on the Cloud Information Model (CIM), the Customer 360 Data Model is the foundation of the Data Cloud standard data model.

MDM

Master Data Management. The system that provides data stewardship and governance across the enterprise.

Modeling

Data modeling is the second step in establishing the data model in the Data Cloud platform. Once the data is ingested in its raw form, with optional transformations, you advance to modeling it.

The model is a consistent, semantic view on top of the data.

Harmonization

Harmonization is the process of mapping the ingested data in alignment with the Customer 360 Data Model.

  • Marketers work with harmonized data abstracted from raw source data schemas.
  • This means they have a common understanding of data regardless of the source of origin, so they can draw insights from that data for marketing purposes.
Customer 360 Data Model

The Customer 360 Data Model is Data Cloud’s standard data model. It helps with the interoperability of data across applications.

When should you choose the standard or custom data model?


You can extend the standard data model with custom attributes, objects, and relationships for a hybrid approach.

Data Transformations


Modeling Process


Inventory Data

Ideally, complete this step ahead of the data ingestion setup process. The purpose of this step is to create an inventory of all data streams ingested into Data Cloud.

Inspect Field-Level Data
  • Establish what data identifies individuals uniquely and ideally across the entire dataset from all sources. This data might be the CRM record ID or MDM ID.
  • Identify the relationship between records in source data, and establish whether or not all of them contain required primary keys that are unique at the data model object level.

For records or objects missing that level of uniqueness, use formula fields to create fully qualified primary keys upon ingestion.

The data that comes to Data Cloud originates from multiple sources and varies in formats. It’s important to assess the arrangement of the records to understand how these will be mapped to the Data Cloud data model. There are two main arrangements to consider—denormalized and normalized.

Denormalized Data

In a denormalized arrangement, a single flat table repeats related data, such as individual attributes, on every row.

Normalized Data

In a normalized arrangement, data is split into separate related tables, such as individuals and their orders, joined by keys.

Why Is This Important?
1. The Data Cloud standard data model is normalized, meaning that data needs to be normalized before it can be mapped within Data Cloud.
2. Not all source systems provide normalized export options or normalize data to the same degree.
3. Therefore, based on your assessment of the field-level data, establish how the source data needs to be transformed and mapped into the Data Cloud data model.
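
A toy example of normalizing a denormalized export before mapping (the field names are hypothetical):

```python
# A denormalized export repeats individual fields on every order row.
rows = [
    {"CustomerId": "C1", "Email": "a@x.com", "OrderId": "O1", "Total": 20.0},
    {"CustomerId": "C1", "Email": "a@x.com", "OrderId": "O2", "Total": 35.0},
]

# Normalize: one Individual record per customer; orders keep a foreign key.
individuals = {r["CustomerId"]: {"Id": r["CustomerId"], "Email": r["Email"]} for r in rows}
orders = [{"Id": r["OrderId"], "CustomerId": r["CustomerId"], "Total": r["Total"]} for r in rows]
```
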

Configure Mapping

Once the data inventory is complete, the field-level data has been identified and examined, and the mappings to the Data Cloud data model are designed, you can proceed with the actual mapping process. There are a few things to keep in mind when mapping data.

Data Category

Data Model Objects (DMO) don’t have a first-class concept of category. Instead, a DMO inherits its category from the first data source object mapped to it. After the data model object inherits a category, only data source objects with that same category can map to it.

Extending the Data Model

If, during the assessment of the data, there are some gaps identified in the standard data model, you can extend the standard objects by adding new custom attributes to accommodate your requirements. In addition to fields, you can also create custom data model objects from the Data Model tab in the application using one of the available methods.

From Existing

When creating a DMO using the “From Existing” method, the schema for the new object first replicates the existing object, and the user is presented with a dialog to adjust the final schema. The dialog enables removal of the initially configured fields and allows additional fields to be added.

From File

The option to create a custom object using “From File” requires the schema to be developed and documented in a CSV file, with these four columns in this exact order: “Label, DeveloperName, Type, IsPrimaryKey.”
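
A schema file following that column order might look like this (the rows and developer names below are illustrative); the snippet also verifies the header order:

```python
import csv
import io

# Example schema file content. The four header columns come from the
# documented requirement; the field rows themselves are hypothetical.
schema_csv = """Label,DeveloperName,Type,IsPrimaryKey
Membership Id,MembershipId__c,Text,true
Points Balance,PointsBalance__c,Number,false
"""

reader = csv.DictReader(io.StringIO(schema_csv))
# The columns must appear in this exact order.
assert reader.fieldnames == ["Label", "DeveloperName", "Type", "IsPrimaryKey"]
rows = list(reader)
```
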

New

The “New Custom DMO” option requires the specification of the object to be developed from scratch, and you’re expected to configure all fields within the UI.


Required Mappings

Certain data mappings are required for the Data Cloud application to drive value across harmonization, unification, and activation. Data sources need to map to the key attributes of these main objects.

Configure Relationships

Once the mapping process is complete, take a look at the relationships between data model objects. It’s important to note that relationships are configured on the data model, not between the data source objects, since end users interact with the data model.

Data Explorer

An administrator or specialist inspects the data in data source objects, data model objects, and calculated insights objects using Data Explorer to support the validation of the configured data objects.

Choose the Object Type


Configure the Columns

Note that the UI only displays up to 100 records and at most ten attributes (columns). The columns displayed in the results can be configured via the Edit Columns button.

Filter The Records

The records can be filtered for display with the use of the Filter List control. This control lets you identify specific records or look at various combinations of attributes.

Inspect the Records

For the data source objects, it’s important to be able to inspect initial values and validate any formula fields that have been calculated during ingestion.

For data model objects, this feature enables validation of the configured mappings and lets you sample records using predefined IDs. Once you get to the calculated insights, you can also inspect calculated values to validate the accuracy of aggregated data and formulas.

Identity Resolution - 14%

Identity Resolution consists of two parts: identity resolution rulesets and profile reconciliation.

Currently, Data Cloud supports up to two Identity Resolution configurations, allowing an element of A/B testing to be applied once an initial ruleset is established and configured. Note that each of the result sets will produce independent unified profiles, resulting in an increased count of total known profiles and therefore impacting allocated platform utilization for the org.

Once the profiles are matched, reconciliation rules tell Data Cloud how the various attributes should be represented in the final unified profile.

Unified Profile

Identity Resolution is used to consolidate data from different sources into a comprehensive view of your customer, which is called a unified profile.

  • All contact points associated with the individual and their complete lineage are retained in Data Cloud.
  • All metrics and behaviors associated with the individual records are combined and independently available.
  • Consent for all touchpoints of the user is tracked and can be used depending on the channel.

Identity Resolution Implementation Process


Profile Data Across Data Sources

In the Data Modeling course, we discussed the use of Data Explorer to inspect and validate both ingested and modeled data in Data Cloud. Combine that approach with Calculated Insights (discussed in another course) to profile data across multiple data sources.

Use Calculated Insights to achieve the following outcomes:

  • Summarize data.
  • Attempt to establish uniqueness of the identifiers, whether or not contact points are shared across individuals.
  • Identify how many contact points are coming from any given source.
  • Inspect whether or not the address (if used as a contact point) was correctly imported.
  • Verify any potential mapping or relationship issues that might lead to incorrect unification results.

Configure Match Rules

The match rules establish criteria for relating source Individual records to each other with the intent to produce a Unified Individual record. The rules can make use of standard and custom attributes mapped to the data model.

Configure Reconciliation Rules

Reconciliation rules establish criteria for “picking a winner” for the Unified Individual and Unified Contact Point record attributes when there’s a clash.
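
A "last updated" style of reconciliation can be sketched as picking the most recently modified value among clashing source records (an illustration only, not Data Cloud's implementation; the `LastModified` field name is hypothetical):

```python
from datetime import datetime

def reconcile_last_updated(candidates: list) -> dict:
    """Pick the attribute 'winner' from clashing source records by taking
    the most recently modified value."""
    return max(candidates, key=lambda r: r["LastModified"])

records = [
    {"FirstName": "Jon",  "LastModified": datetime(2023, 1, 5)},
    {"FirstName": "John", "LastModified": datetime(2023, 6, 1)},
]
winner = reconcile_last_updated(records)
# winner["FirstName"] == "John"
```
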


Validate Results

Once you publish the Identity Resolution and processing completes, validate the results. Start with the initial evaluation of the resolution summary; does the data look directionally correct?

Unified Link

To assist with the validation efforts, make use of the additional DMOs and relationships that are automatically created to keep unified records, plus the bridge tables that maintain links between the original Individual, Party Identification, and Contact Point Channel records and their unified versions. These objects and relationships apply to a configuration where Email is the only contact point mapped in the data model.

Review

At the completion of the Identity Resolution process, once you’re happy with the results, plan to make a periodic re-evaluation of the data. Use the same Calculated Insights or add more to assist with ongoing review of the summarized and aggregated data, establish rules for inspection of the unified profiles, and potentially consider A/B testing to further improve the matching ruleset.

Monitoring Identity Resolution

The Ruleset Status indicates the status of a given Identity Resolution configuration record. The key status is Published as it indicates the currently active ruleset. The Last Job Status indicates the current (latest) run status, with In Progress indicating that Identity Resolution is being re-evaluated and Success indicating completed resolution without errors.

Note that the Identity Resolution process runs periodically after initial publishing, and as a user you don’t have control to define the specific time of day when this process starts.

Refer to the Processing History tab on the Identity Resolution record details page to establish the cadence and average duration of the job when planning for the time of your segmentation and activation processes.

Party Identification Matching

Party Identification attributes let you use external identity graphs, including but not limited to Master Data Management (MDM), license numbers, mobile identifiers, and anonymous profiles via first-party cookie ID.

The key here is to understand that identity resolution matches two records that share the same Identification Number value for the same Party Identification Type and Identification Name.

Of these, Party Identification Type, Identification Name, and Identification Number are the most significant fields, as they directly impact the matching during the unification process.

When configuring match rule criteria to use Party Identification, the Party Identification Name is used to specify the name of the identity source or space. The Party Identification Type is optional in this configuration, although using it provides an additional level of organizing the identity.
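
The matching behavior described above can be sketched as grouping records on the (Type, Name, Number) triple (the field names and data are illustrative):

```python
from collections import defaultdict

def match_by_party_identification(records: list) -> list:
    """Group Individual records that share the same Identification Number
    within the same Party Identification Type and Identification Name."""
    groups = defaultdict(list)
    for r in records:
        key = (r["IdentificationType"], r["IdentificationName"], r["IdentificationNumber"])
        groups[key].append(r["IndividualId"])
    # Only groups with more than one member produce a match.
    return [ids for ids in groups.values() if len(ids) > 1]

records = [
    {"IndividualId": "I1", "IdentificationType": "LoyaltyId", "IdentificationName": "RewardsProgram", "IdentificationNumber": "555"},
    {"IndividualId": "I2", "IdentificationType": "LoyaltyId", "IdentificationName": "RewardsProgram", "IdentificationNumber": "555"},
    {"IndividualId": "I3", "IdentificationType": "LoyaltyId", "IdentificationName": "RewardsProgram", "IdentificationNumber": "777"},
]
matches = match_by_party_identification(records)
# I1 and I2 share the same number within the same type and name, so they match.
```
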

Handling of Anonymous Profiles

It’s worth mentioning that Data Cloud provides the means to ingest anonymous profiles in addition to known profiles. This allows for scenarios where initial data collection occurs for individuals who did not identify themselves with the brand in any way.


To categorize the Individual object record as anonymous, be sure to map the Is Anonymous field for the respective contributing data source.

Once at least one known Individual profile is matched with an anonymous record, that record will be marked as known going forward.

Segmentation and Insights - 18%

Segmentation

Use segmentation to break down your data into useful segments to understand, target, and analyze your customers. Create segments on any entities from your data model, and then publish them on a chosen schedule or as needed.


Create a segment in Salesforce Data Cloud to publish data to activation targets. Create your segment with basic properties, such as name and description.

Create a Segment

Segments are created by completing the fields in the new segment window and specifying your criteria in the Segmentation Canvas.

Segment Canvas

On the Segment Canvas in Salesforce Data Cloud, use direct and related attributes to narrow down a created segment to your target audience.

Attribute Library

The Attribute Library in the Segmentation Canvas displays the direct (1:1) and related (1:Many) segmented target data that has been mapped into the Data Cloud Data Model and marked for use in segmentation.

Each segment can have up to 50 attributes, and each attribute can have up to 10 nested operators.

The maximum number of segments for a Salesforce org is 9,950.

Rule Builder

Within the Segmentation Canvas, build filters to define your target audience using your 1:1 and 1:Many data, and features like on-the-fly aggregates, filter frequency, relative date expressions, and nested operators.

Rule Builder also allows you to use Calculated Insights that have been created in your segment criteria.


Container Basics

Containers provide a way to create relationships between your related attributes. Attributes within a container act on the same data row in the data model table. When attributes are placed in one container, the query engine looks for attributes that relate to one another in this way. The Attribute Library displays objects up to five relationships away from the container.

  • If you place “yellow” and “scarf” in the same Order Product container using AND, the query engine looks for a customer who purchased a “yellow scarf” as a single product on the purchase.
  • If you place “yellow” and “scarf” in separate containers, the query engine looks for customers who purchased any yellow product and also purchased a scarf of any color.
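
The two container behaviors can be illustrated with plain set logic over sample order rows (hypothetical data):

```python
order_products = [
    {"CustomerId": "C1", "Color": "yellow", "Product": "hat"},
    {"CustomerId": "C1", "Color": "red",    "Product": "scarf"},
    {"CustomerId": "C2", "Color": "yellow", "Product": "scarf"},
]

# Same container: both conditions must hold on a single row,
# so only the customer who bought a yellow scarf qualifies.
same_container = {r["CustomerId"] for r in order_products
                  if r["Color"] == "yellow" and r["Product"] == "scarf"}

# Separate containers: each condition may be satisfied by a different row,
# so a customer with any yellow product plus any scarf also qualifies.
bought_yellow = {r["CustomerId"] for r in order_products if r["Color"] == "yellow"}
bought_scarf = {r["CustomerId"] for r in order_products if r["Product"] == "scarf"}
separate_containers = bought_yellow & bought_scarf
```
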

Container Path

Some containers can have more than one data relationship back to the segmentation entity (“Segment On”). Select one to help Data Cloud understand how to build your segment.

  • An order for a used car has a buyer ID and seller ID relating back to an individual.
  • When using order attributes as filter criteria, choosing a path lets the user decide which group of individuals is used to build the segment: Car Buyers or Car Sellers.

Count Segment

Count Segment allows you to request a count of the segment targets that are in your segment, based on your current data ingested and defined segment filters.

  • Upon creation, a segment displays the count of all Data Cloud segment targets in this tenant.
  • As filters are added, updated, or removed, the marketer can request the count again.
  • Frequent and easy validation of the count gives the marketer confidence that the segment reflects the target audience they’re trying to create. A count that’s greatly off could be an indication of a problem with the mapped data model, data loaded, or the filter definition.
  • Counts are also used as a budgeting or planning exercise by the group responsible for messaging, such as, “I need to plan for a campaign that will message around 200,000 individuals.”

The Segment Count shows the overall count of members who fall in a specific segment. To see the individual-level details on the records within a specific segment, it needs to be published and activated.

Publish Segment

Publish your segment, either ad hoc or on schedule, so it’s available in activation targets like Marketing Cloud and AWS S3 Bucket.

  • To use the segment for messaging, it needs to be published to one or more activation targets.
  • Publish reruns the count first and then creates a materialized segment. Activation targets are notified that a new segment exists to use.
  • Segment publishes can be:
    • Ad hoc—Customers who have an unscheduled immediate campaign, one-time campaign, or want to manually decide when to use the segment. You would select the “Don’t refresh” Publish option.
    • Scheduled—Customers who want to “Set it and forget it” for ongoing campaigns. The Publish schedule available currently is every 12 hours or 24 hours.

If you delete or deactivate a segment, there’s no functionality to re-enable it.

If you deleted (or deactivated) in error, you must recreate the segment.

If you plan to use the segment again, stop the publish schedule instead of deleting the segment.

Segment Exclusion

Apart from including members in your segment, you can also set up exclusions that allow you to explicitly exclude members from being in your segments. The segmentation canvas lets you intuitively access and create exclusion filters within the same interface.

Calculated Insights in Segments

Segmentation also lets you use Calculated Insights that have been created in your segmentation criteria.

  • Integrate Attributes
    Empower marketers to focus on data-driven marketing initiatives and campaigns by integrating computed attributes between segmentation and analytics.
  • Simplify the Rules
    Making sense of large-scale behavioral data isn’t easy. Simplify the complex rules a marketer would have to create in segmentation with standard yet robust views that are reusable.
  • Reduce Time
    Reduce the time needed for counts and publish when standard computed attributes are used across segments. This means calculating once in order to reference many times.

  • The Segment On entity must be a profile when using Calculated Insights in segments.
  • For insights to appear in segments, the table that you segment on must be added to the query as a JOIN.
  • The primary key of the segmented table must also be a dimension in your created insight.
  • The segment container can only include one metric.
  • The segment container can include multiple dimensions as filter criteria for a metric.
  • Streaming insights aren’t currently supported in segments.

Identity Resolution

Segmenting population counts lets you compare the impact of different Identity Resolution rulesets for the same entity.

After using Identity Resolution to create different rulesets for the same entity, entities and attributes created by both rulesets are available in segmentation.

You can use attributes from both rulesets to validate and test different population counts.

Value Suggestion

Enable Value Suggestion for mapped text attributes. Attributes with Enable Value Suggestion turned on in the data model let you search data values with type-ahead functionality that surfaces ingested values for your attributes.

  • Only text attributes can be enabled for value suggestions.
  • Value suggestion can be enabled for up to 500 attributes in your entire org.
  • It can take up to 24 hours for suggested values to be visible after being enabled.
  • For attributes with more than 1,000 values, the most frequently occurring 1,000 values in the dataset are displayed alphabetically. Some operators allow you to select multiple values for an attribute.
  • Values with more than 255 characters aren’t available as suggested values, but you can still type them in to filter on them.
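
The value-suggestion behavior for high-cardinality attributes (keep the most frequent values, then display them alphabetically) can be sketched as:

```python
from collections import Counter

def suggested_values(values: list, limit: int = 1000) -> list:
    """Keep the most frequently occurring `limit` values, exclude values
    over 255 characters, and display the result alphabetically."""
    eligible = [v for v in values if len(v) <= 255]
    top = [v for v, _ in Counter(eligible).most_common(limit)]
    return sorted(top)

data = ["red", "blue", "red", "green", "blue", "red"]
suggestions = suggested_values(data, limit=2)  # most frequent two, alphabetized
```
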

Segment Time Zone

Publish time zones are set from the Data Cloud org’s time zone when a segment is first saved in the segment canvas. Updates made to your Salesforce org’s time zone aren’t reflected in existing segments.

  • Newly created segments follow the updated time zone.
  • To update the time zone of existing segments, resave your segments.

Nested Segments

Nested segments let you use another, existing segment when building a segment, so you can build on the existing rules rather than recreate them.

  • Simplify segment creation for common elements.
  • Encapsulate the segment for reuse, which is efficient and allows for organizational consistency.

Viewing Segment Membership Data

When publishing your segment, a segment membership data model object (DMO) is automatically created to store information about the members of the segment.

  • Check test data in the segment that you created to ensure the membership is correct.
  • Obtain deeper insights on the member composition of your segments.
  • Identify which profiles have entered and exited a segment over specific periods of time.
  • See segment membership as a related attribute when querying one or more specific profiles.

Each entity you build a segment on has a set of membership DMOs across all segments using it.

Best Practices

  • Preparation
    • Plan your use case.
    • Remember—planning is everything!
  • Data Availability
    • Ensure the data you need is available.
    • Work with your data-aware specialist.
    • Understand the data model.
    • Make sure that the data modeler works with the marketing or segmenter ahead of time.
  • Data Cleansing
    • Cleanse and prepare your data for segmentation before you use that data in an audience segment or for personalizing content.
    • Use customer Identity Resolution to help mitigate any duplicate contacts in your data.
  • Data Validation
    • Take a step-by-step approach.
    • Request counts for every filter.
    • Use Data Explorer to explore and validate your input data to segment on.
  • Publishing
    • Find the best time to publish in relation to other segment publishes.
    • Review helpful fields, such as Last Published End, Last Publish Status Date Time, Next Publish Date Time, and Publish Schedule.

Insights

The Insights feature within Data Cloud lets you define and calculate multi-dimensional metrics from your entire digital data state in Data Cloud.

Metrics

Metrics are quantitative measurements of aggregated data from Data Cloud that are used to evaluate something.

These metrics can include factors such as Customer Lifetime Value (LTV), Recency Frequency Monetary (RFM), Most Viewed Categories, and Customer Satisfaction Score (CSAT).

Calculated Insights

Define and calculate multidimensional metrics from your entire digital state stored in Data Cloud.

Streaming Insights

Create metrics on streaming data coming from real-time data sources to use real-time insights. Drive more value with insights by enabling cross-functional orchestration.

Calculated Insights

Calculated Insights lets you enhance your data by extracting additional insights about your customers.

  • Build multi-dimensional metrics: Define multi-dimensional metrics—such as LTV, CSAT, RFM, and others—on the entire digital state stored in Data Cloud.
  • Supercharge your segmentation: Use these insights within Segment Builder to gain a deeper understanding of your customer.
  • Activate for personalization: Personalize your marketing activations.

Marketers use Calculated Insights to define segment criteria and personalization attributes for activation using metrics, dimensions, and filters. This feature is natively available for profile-level insights.

The benefit of Calculated Insights is the ability to:

  • Compute a complex attribute.
  • Supercharge segmentation, such as RFM or LTV scores across different objects.
  • Compute outside of segmentation, which abstracts away the aggregations and calculations.
  • Use that in segmentation.

Streaming Insights

Streaming Insights lets you create metrics on streaming data coming from real-time data sources to use real-time insights. Drive more value with insights by enabling cross-functional orchestration.

The streaming insight expression interface is very similar to calculated insights, except you don’t have the option to use existing insights. And you need to account for a window of time like in this example.

SELECT COUNT( RealTimeMobileEvents__dlm.pageviews__c ) AS page_views__c,
       ssot__Individual__dlm.ssot__Id__c AS customer_id__c,
       RealTimeMobileEvents__dlm.product__c AS product__c,
       WINDOW.START AS start__c,
       WINDOW.END AS end__c
FROM RealTimeMobileEvents__dlm
JOIN ssot__Individual__dlm
  ON ssot__Individual__dlm.ssot__Id__c = RealTimeMobileEvents__dlm.deviceId__c
GROUP BY window( RealTimeMobileEvents__dlm.dateTime__c, '5 MINUTE' ),
         customer_id__c
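
Outside of Data Cloud, the same 5-minute tumbling-window aggregation can be sketched in a few lines (an illustration of the windowing concept only, not the platform's engine; the event data is hypothetical):

```python
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)

def window_start(ts: datetime) -> datetime:
    """Align a timestamp to the start of its 5-minute tumbling window."""
    epoch = datetime(1970, 1, 1)
    offset = (ts - epoch) % WINDOW
    return ts - offset

events = [
    {"customer": "C1", "ts": datetime(2023, 5, 1, 10, 1)},
    {"customer": "C1", "ts": datetime(2023, 5, 1, 10, 4)},
    {"customer": "C1", "ts": datetime(2023, 5, 1, 10, 7)},
]

# Count events per (customer, window) pair, like the GROUP BY above.
counts = {}
for e in events:
    key = (e["customer"], window_start(e["ts"]))
    counts[key] = counts.get(key, 0) + 1
```

Here the first two events fall into the 10:00 window and the third into the 10:05 window.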

How Do Streaming Insights Work?

The way to use a Streaming Insight is through a Data Action. When an Insight is obtained, a Data Action is created that you can act upon. Data Actions make your Streaming Insights usable and actionable.

Collect the Insights

Streaming Insights can continuously produce sophisticated insights on events collected from streaming sources such as:

  • Website or mobile clickstreams
  • Internet of Things (IoT) signals
  • Database event streams
  • Financial transactions
  • Social media feeds
  • Customer profile updates
  • Location-tracking events
Analyze the Insights

In Data Cloud, streaming data and insights are collected in real time at high volume. Data Action rules ensure customers get highly curated, useful output streams to use in other applications, driving meaningful outcomes such as:

  • Generating time-series analytics on continuously moving data.
  • Leading users to find useful patterns and share the insights with other apps with Data Actions.
Send Data Actions

A Data Action rule triggers an action with an appropriate payload when certain conditions are met.
The current Data Action Targets supported in Data Cloud are:

  • Webhook: Send Data Actions to any webhook target and protect the message integrity with the Salesforce-generated secret key.
  • Salesforce Platform Event: Send Data Actions to the core event bus, which enables building flow applications based on near real-time insights generated in Data Cloud.
  • Marketing Cloud: Manage event and insight-driven scenarios with Marketing Cloud messaging and journey capabilities.

What Are the Benefits of Streaming Insights?

  • Streaming Insights and Data Actions
    • Process Event Streams: Build Insights on near real-time data streams, such as web and mobile SDK.
    • Define Data Actions: Create business rules and trigger useful actions.
    • Drive Automation (Flow and Webhook): Build deep integration with the Salesforce Platform (Platform Event Bus Flow). Webhook support enables first-party and third-party ecosystems to enable automation.

Streaming Insights act on streaming data, which is currently supported via the Web SDK, Mobile SDK, and Interaction Studio connectors.

The aggregation time window for a streaming insight can be a minimum of 5 minutes and a maximum of 24 hours.

Streaming Insights vs. Calculated Insights

You can use both Calculated and Streaming Insights in different ways to make the most of your data. Use the comparison table to decide which one is best for you.
[image]

Insights vs. Formulas vs. Segmentation Operators

At ingestion time, use formulas to perform operations on row-based data used downstream.
At segmentation time, audiences are created with segment operators.
With Calculated Insights, use views to make sense of large-scale behavioral data, and reuse views to enhance segmentation.

Formulas

Formulas are great for calculations and operations on different data types and are ideal to use when you can use a simple, row-based operation to abstract logic that the user can consume and use at segmentation time.

Calculated Insights (CIs)

Calculated Insights make it easy for users to define segment criteria and personalization attributes for activation using metrics, dimensions, and filters. Calculated Insights are best used for:

  • Non-trivial calculations—for example, calculating NPS as a percentage.
  • Complex queries across multiple objects.
  • Reusability purposes—for example, when you expect to use a view multiple times across different values, or when you’re combining related attributes in the same view.
Segmentation Operators

Use operators to complete self-service filtering use cases without the help of a data analyst. Segmentation operators are a good choice for:

  • Simple aggregations, like count, on one object.
  • Maximum, minimum, average, and sum aggregations on numbers when the conditions are simple and unlikely to be reused often.
  • Filters on two years or less of aggregated data.

When segmentation operators don’t fully support your use case, use Calculated Insights.

Use Calculated Insights for maximum, minimum, average, and sum aggregations on numbers that are reused among segments or that have a long list of other attributes in the filter container.

Use Case

[image]

Metrics on Metrics

The Metrics on Metrics feature lets you create Calculated Insights on Calculated Insights.

Organize complex Calculated Insights SQL into logical steps and stitch together powerful workflows. Metrics on Metrics also lets you refine metrics and reuse them across multiple use-case scenarios.

  • It allows an output of one Calculated Insight to be the input of another Calculated Insight.
  • Use any previous Calculated Insight in Metrics on Metrics.
  • Organize complex Calculated Insights SQL data into logical steps and stitch powerful workflows and improve reuse.
  • Build comprehensive insights, such as engagement scores, customer health scores, and customer attribution scores.
  • Metrics on Metrics supports three levels of hierarchy.
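Conceptually, a metric on a metric is one aggregate query reading the output of another. A toy illustration using SQLite with invented table and field names (Data Cloud runs its own SQL engine over DMOs; this only shows the layering idea):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical engagement rows (a stand-in for a DMO)
cur.execute("CREATE TABLE engagement (individual_id TEXT, clicks INTEGER)")
cur.executemany("INSERT INTO engagement VALUES (?, ?)",
                [("a", 3), ("a", 5), ("b", 1), ("b", 1), ("c", 10)])

# Level 1 (inner query): a per-individual metric, analogous to a first CI.
# Level 2 (outer query): a metric computed over that metric's output.
cur.execute("""
    SELECT AVG(total_clicks)
    FROM (SELECT individual_id, SUM(clicks) AS total_clicks
          FROM engagement
          GROUP BY individual_id)
""")
result = cur.fetchone()[0]
print(result)
```

Each level here corresponds to one Calculated Insight in the hierarchy; Data Cloud caps this at three levels.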

Managing Insights

The Insights home page tab lets you view the details and expression of all created insights. The status, last run time, and last run status fields all update in real time.
[image]
Use the dropdown arrow next to your Calculated Insight to make various changes to its content or status. You can also complete these actions from the desired Insights record page.
[image]

The tab within Data Cloud is called Calculated Insights, but you can see both types of Insights here.

  • Edit: Use the Edit function when you want to update the logic and attributes of an existing Calculated Insight.
  • Clone: Use the Clone feature to duplicate and edit the SQL function from this record.
  • Enable or Disable: Use the Disable feature to turn off processing for a Calculated Insight. Use the Enable feature to reverse this action.
  • Publish Now: Execute a Calculated Insight immediately in Data Cloud. Validate the results of a Calculated Insight as needed instead of processing it as a batch schedule and waiting hours for it to execute. You can submit three executions per day.
  • Show in Builder: This opens up the Insight in the Visual Builder (only for Insights created via the Visual Insights Builder).
  • Delete: Use the delete feature to permanently remove a Calculated Insight and its connections.

Guidelines and Limits for Editing a Calculated Insight

When editing a calculated insight in Data Cloud, consider these guidelines and limits.

ITEM GUIDELINES AND LIMITS

  • Measures
    • You can add a measure to an existing calculated insight.
    • You can’t change the API name, data type, or rollup behavior.
    • You can add only aggregatable measures to an aggregatable calculated insight.
    • You can add any measure to a non-aggregatable calculated insight.
    • You can’t remove existing measures.
  • Dimensions
    • You can’t change the name and data type.
    • You can’t add dimensions to a calculated insight unless it’s a key qualifier dimension. A key qualifier dimension is created by using the key qualifier field of an existing calculated insight’s dimension. For example, SELECT Individual.KQ_Id__c AS kqid__c, in which Individual.KQ_Id__c is the key qualifier field of the calculated insight’s Individual.Id__c dimension.
    • You can’t update a non-transformed dimension. A non-transformed dimension is created directly from a DMO attribute. For example, SELECT SalesOrder.soldToCustomer__c AS soldToCustomer__c or SELECT Individual.Id__c AS id__c.
    • You can update a transformed dimension if you don’t change its name or data type. A transformed dimension has a transformation function. For example, SELECT day(SalesOrder.PurchaseDate__c) AS purchaseDate__c.
    • You can’t remove existing dimensions.
  • Segmentation or Activation
    • You can edit a calculated insight used in segmentation or activation.
  • Filters
    • You can edit a filter of a calculated insight.
  • JOIN
    • You can update JOIN conditions.

For customers who have both Data Cloud and CRM Analytics, view your Insights data in Analytics right from the Calculated Insight screen.
[image]

Data Cloud: Creating and Authoring Insights

Visual Insights Builder

Visual Insights Builder is a no-code, user-friendly insight-authoring tool. There’s no need to use SQL. Using Visual Insights Builder, you can generate the same metrics and insights as those written in SQL.

  • Aggregate. The aggregate function performs a calculation on a set of values in Data Cloud and returns a single value.
  • Case. Create a logic statement that narrows the result based on specific criteria.
  • Filters. Define filters on the measures and dimensions to reduce the results based on when a condition is met.
  • Transform. Change or clean data based on the fields in your selected data object.
  • Arithmetic Expression. Write arithmetic calculations to create insights used in segmentation filters.

[image]

  • You can save the Insight as a draft or ‘Save and Run’ it immediately.
  • Insights created with the Builder can be viewed either in the Builder or as SQL.
  • Insights can be edited to add additional items, but there are guidelines on what’s allowed. For example, you can add measures and filters, but you can’t add or remove dimensions on an existing insight.
SQL Builder

Calculated Insight SQL Builder lets you use the full power of SQL to create your insights. You can create both Calculated Insights and Streaming Insights via SQL.
[image]

  • Measures and Dimensions must end with __c.
  • A Measure must be an aggregate function.
  • There must be at least one Measure.
  • Only numeric Measures are supported; non-numeric Measures like MAX(Date) won’t work.
  • In Data Cloud, timestamps, including those in Calculated Insights, are stored in Coordinated Universal Time (UTC).
  • Insights are refreshed from once a day to multiple times a day, depending on the volume of data and the complexity of the queries.
  • You can execute a Calculated Insight immediately, so that you can validate its results as needed instead of processing it on a batch schedule and waiting hours for it to execute.

You can submit three executions per day per Calculated Insight.
[image]
Streaming Insights require you to take several setup steps before creating the Insight.
[image]
The image below offers tips for creating Calculated Insights using SQL.

[image]

Insights Tips
Calculated Insight in Segmentation
  • The Segment On entity must be a profile when using Calculated Insights in Segments.
  • In order for Insights to appear in Segments, the table that you segment on must be added to the query as a JOIN. The primary key of the segmented table must also be a dimension in your created Insight.
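The JOIN requirement can be sketched with a toy query. Here SQLite stands in for the CI engine and all table and field names are invented; the point is that the segmented table is JOINed in and its primary key survives as a dimension:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-ins for the profile DMO you segment on and an engagement DMO
cur.execute("CREATE TABLE individual (id TEXT PRIMARY KEY)")
cur.execute("CREATE TABLE purchase (individual_id TEXT, amount REAL)")
cur.executemany("INSERT INTO individual VALUES (?)", [("a",), ("b",)])
cur.executemany("INSERT INTO purchase VALUES (?, ?)",
                [("a", 10.0), ("a", 15.0), ("b", 5.0)])

# The segmented table (individual) is JOINed in, and its primary key
# (individual.id) is kept as a dimension so the insight can surface in
# segmentation. The __c suffixes mirror the CI field-naming convention.
cur.execute("""
    SELECT individual.id AS id__c,
           SUM(purchase.amount) AS total_spend__c
    FROM purchase
    JOIN individual ON individual.id = purchase.individual_id
    GROUP BY individual.id
    ORDER BY individual.id
""")
rows = cur.fetchall()
print(rows)  # [('a', 25.0), ('b', 5.0)]
```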
Calculated Insight in Activation

You can only activate Calculated Insight metrics (not dimensions). A workaround to activate dimension data is to use ‘FIRST’, ‘MAX’, or a similar function, which makes that column a metric.

Note that when using this method, the Calculated Insight should be written so that the value returned as a measure aligns with the logic used in the GROUP BY or WHERE clause.
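A toy version of that workaround, again in SQLite with invented names: wrapping the dimension in an aggregate turns it into a measure the activation can carry, and the GROUP BY keeps one value per individual:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (individual_id TEXT, tier TEXT, total REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [("a", "gold", 120.0), ("a", "gold", 30.0), ("b", "silver", 40.0)])

# 'tier' is a dimension; MAX(tier) exposes it as a measure so it can be
# activated. The GROUP BY must align with the value being returned.
cur.execute("""
    SELECT individual_id AS id__c,
           MAX(tier) AS tier_measure__c,
           SUM(total) AS total_spend__c
    FROM orders
    GROUP BY individual_id
    ORDER BY individual_id
""")
rows = cur.fetchall()
print(rows)  # [('a', 'gold', 150.0), ('b', 'silver', 40.0)]
```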

[image]

Act - 18%

The magic of Data Cloud is in creating experiences that wow customers. Data Cloud offers many ways for users to act on data. It lets users create streaming insights that can trigger data actions to a variety of locations or targets. The Data Action feature allows events, streaming insights, and data changes to trigger flows or events in a variety of locations. For example, an automotive company uses a data action to trigger an email to a customer in Marketing Cloud when their vehicle crosses the 10,000-mile mark.

Data Cloud for Marketers lets customers create audience segments for personalized marketing campaigns within Journey Builder. Segments can also be activated to a rich ecosystem of advertising partners, including Facebook (Meta) and Google. Beyond marketing, Data Cloud data can help create experiences in Commerce Cloud, Marketing Cloud Personalization, and more.

  • Segment Activation, which materializes and publishes your segment to activation platforms.
  • Data Actions, which act on your streaming data and insights to trigger actions based on certain conditions, which can enable downstream systems to drive an action or orchestration.

For administrators, there are differences between these two processes and where they apply. At a high level, activations on segments are Batch Activations, whereas Data Actions act on real-time streaming data.

Activation of Segments

Activation is the process that materializes and publishes your segment to activation platforms.
[image]

  • Activation is as easy as clicking a button to send segment data for activation in messaging, advertising, personalization, and external systems.
  • You can publish your segments, including contact points to your activation targets.

Activation Process

  • Create Activation Targets. Create a connection to different targets, including Marketing Cloud, Cloud File Storage, Data Cloud, and external Activation Platforms.
  • Create an Activation on a Segment. Begin the process of activating a segment that publishes audiences to Activation Targets for marketing campaigns.
  • Select Activation Membership. Select a different entity to activate on than the one segmented on to send to the activation target.
  • Select Contact Points. Select which fields and objects from Data Cloud to include in an activation.
  • Add Additional Attributes. Push additional attributes to the activation targets for journey decisions and message or content personalization.
  • Publish the Segment. After saving the activation, view a history of activations to track their statuses.

Activation Targets

An activation target stores authentication and authorization information for a given activation platform. Publish your segments and include contact points and additional attributes to the activation target platforms.

Create an activation target to these platforms:

  • Cloud File Storage Activation Target
  • Marketing Cloud Activation Target
  • Data Cloud Activation Target
  • External Activation Platform Activation Target

An activation target is automatically created for each Marketing Cloud Personalization account and B2C Commerce instance connected to Data Cloud. It doesn’t need to be created separately.

Cloud File Storage Activation Target

The Cloud Storage Activation Target lets you publish segments from Data Cloud to AWS S3. Before creating an S3 activation target, determine your S3 access key and secret key.

  • Activation to S3 can be made without mapping contact points.
  • Data Cloud supports Amazon S3-managed keys (SSE-S3).
  • After you create and activate segments to Cloud File Storage, a subfolder called Salesforce-c360-Segments is automatically created.

Segments in S3 are created in the YYYY/MM/DD/HH/{first 100 characters of segment name}/{20 characters of activation name}_{timestamp in yyyyMMddHHmmsssSSS format} folder.
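The folder convention can be sketched as a small path builder. The quoted timestamp pattern (yyyyMMddHHmmsssSSS) is approximated here as seconds plus milliseconds; treat the documented format as the source of truth:

```python
from datetime import datetime, timezone

def s3_segment_key(segment_name: str, activation_name: str, ts: datetime) -> str:
    """Build the activation prefix described above:
    YYYY/MM/DD/HH/{first 100 chars of segment name}/
    {first 20 chars of activation name}_{timestamp}.
    The millisecond piece of the quoted pattern is approximated."""
    stamp = ts.strftime("%Y%m%d%H%M%S") + f"{ts.microsecond // 1000:03d}"
    return (f"{ts:%Y/%m/%d/%H}/"
            f"{segment_name[:100]}/"
            f"{activation_name[:20]}_{stamp}")

# Hypothetical segment and activation names
ts = datetime(2024, 5, 1, 13, 45, 9, 123000, tzinfo=timezone.utc)
key = s3_segment_key("High Value Customers", "Email Reactivation Campaign", ts)
print(key)  # 2024/05/01/13/High Value Customers/Email Reactivation C_20240501134509123
```

Note how the activation name is truncated to 20 characters, so distinct activations with long, similar names can collide on the name portion; the timestamp suffix keeps the keys unique.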

Marketing Cloud Activation Target

Publish your segments directly from Data Cloud to Marketing Cloud business units. Marketing Cloud Activations let you activate across your messaging channels, including email, SMS, and Mobile Push.

After you create and activate segments to Marketing Cloud, they show up in Contact Builder as a Shared Sendable Data Extension. A Data Cloud Segments subfolder is automatically created when you publish your first activation.

  • Segments are created as a shared data extension with this naming format: {first 52 characters of segment name}{16 characters of activation_name}{32 characters of Alphanumeric Random number}.
  • Activated segments shared by more than one Business Unit (BU) are created as a shared data extension in the Data Cloud subfolder.
  • The shared data extension for the activation target is based on the selected BUs.

The audience refreshes can be based on a marketer-controlled publish schedule on the segment.

As part of the setup, you can select any number or combination of Marketing Cloud business units for activation independent of the BUs selected for ingestion.

Business Unit (BU) Aware Activation: Data Cloud also supports Marketing Cloud BU Aware activations. More content is coming in the future. Monitor the Partner Community updates in Slack and Chatter.

Data Cloud Activation Target

Activate segments to Data Cloud so that Salesforce core and non-core apps can query for segment membership, Calculated Insights, and attributes through the Connect API or Query API.

Data Cloud activation creates a Data Model Object (also called a curated DMO), which is then available in Data Cloud (both in the UI and the API).

The apps can query for segment membership and use it in a variety of use cases, such as loyalty, where segment membership drives loyalty points accrual.

[image]

  • One Data Cloud activation target is created for your org.
  • Segments are found in Data Cloud via the Query API or directly via Data.
  • For Data Cloud, three attributes are persisted into the curated DMO by default: {Segment ID}, {Segment ID Name}, and {Segment ID Last Processed}.
External Activation Platform Target

The External Activation Platform lets you create and define activation platform metadata that can be packaged and listed on AppExchange.

In addition, you can directly activate to Google and Meta, which lets you activate personalized advertising at scale using native integrations.

This allows you to securely join first-party data in Data Cloud together with premium advertising partners to extend reach to new channels and engage audiences using data from across the full customer relationship, all while protecting consumer privacy.

External Activation Platform creation or packaging is only supported in Namespaced Developer Editions.

Data Actions

Data Actions allow you to act on your streaming data and streaming insights. Streaming data can include data from the Web and Mobile SDK connectors or the Marketing Cloud Personalization connector.

Data Actions ‘Activate’ your streaming data in real time.

[image]

The supported data action targets are Salesforce Platform Event, Webhook, and Marketing Cloud.

  • Orchestrate Salesforce CRM workflows with insights and data events from Data Cloud.
  • Integrate data actions in Mulesoft Anypoint by sharing real-time aggregated event data with external partners based on criteria.
  • Integrate with SaaS applications with real-time signals from Data Cloud.
  • Trigger serverless functions that work with a webhook based on insights in Data Cloud.
  • Connect multicloud workflows or services when useful events happen in Data Cloud.
  • Push unfiltered insights and engagements to your data lake for near real-time analysis and storage.
[image]

Activation Creation

Activation is the process that materializes and publishes a segment to activation platforms. The following three steps, showcased in the “Overview” section of the course, identify the activities for creating an activation on a segment.

Activation Membership

When creating the activation for a segment, select an entity from the Activation Membership line.

  • Activation Membership expands the possibility of activating the profile entities that have a 1:Many relationship with the segment entity.
  • Activation membership can be either the Segmented On entity or the entities with a 1:Many relationship with the Segmented On entity. In essence, you can specify a different object for your Activation Membership other than what it’s segmented on.

Use Activation Membership to add fields from different Data Model Objects (DMOs) than those your segment is built on to send to your activation target.

Unified Individual vs. Individual (in Activation)

To take advantage of Identity Resolution, use the Unified Individual as your Activation Membership.

Identity Resolution allows you to combine data from disparate data sources to create one Unified Individual.

Data Cloud activates one row for a Unified Individual. Each Unified Individual includes an email subscriber key, one contact email address, one contact phone number, one phone country code, and the additional attributes selected when creating the activation.

Contact Points and Source Priority Order

Begin by selecting the contact points, then edit the Source Priority order for each contact point.

  • The Source Priority order can be changed by editing. Use the Reorder function to reprioritize the source.
  • Delete Any Source and Any Type from the Source Priority order to only use values from specific sources in your activation.
  • However, expect the population count of your activation to be lower since you’re selecting values from fewer data sources.

Contact Point selections determine which objects and fields are included in an activation. Contact points are selected based on the Activation Target.

Additional Attributes

Attribute Library lets you add additional attributes to your activation. You can add the following attribute types to your activation:

  • Attributes of the Activation Membership entity.
  • Attributes from entities mapped with a direct relationship to the Activation Membership entity.

When you include additional attributes in an activation, you can give the attribute a Preferred Attribute Name for that activation. You can add up to 100 additional attributes for an activation.

Apart from direct attributes, Data Cloud also allows you to activate one-to-many related attributes (such as products purchased or claims processed) during activation. This unlocks additional use cases and expands personalization capabilities in your messaging and journeys and other activations, like showing a list of the top five products in your messaging journeys.

Calculated Insights in Activation

Use Data Cloud Calculated Insights to enable activation journey decisions and message personalization.

Add your Calculated Insights (CI) metrics (measures) onto any new or existing activation. You can also add dimension filters to your CI metric for more granular insights on an activation.

NOTE: You cannot add a dimension field from your CI as an additional attribute.

[image]

Managing and Reviewing Activations

You can manage your existing activations in the Activation tab in Data Cloud. Activities include editing and deleting an existing activation.
[image]

Best Practices and Tips

Marketing Cloud Activations

  • Profile Unification Best Practices. Each activated Unified Profile row includes:

    • Individual ID (Subscriber Key)
    • Contact Point Email Address (when the Email channel is selected)
    • Contact Point Phone (when the SMS channel is selected)
    • Contact Point Phone Locale (when the SMS channel is selected)
    • Any additional attributes selected at activation
  • Individual ID Mapping. For Marketing Cloud activation, Individual ID maps to the Subscriber Key on the Sendable DE Relationship.

  • Individual Multiple Sources. For Unified Profiles with Individuals from multiple source systems, Data Cloud always defers to Individuals originally sourced from Marketing Cloud.

  • Individuals not from MC. For Unified Profiles that have no Individuals sourced from Marketing Cloud, new records will be introduced to Marketing Cloud in the Unified Audience Activation.

Data Cloud also supports Marketing Cloud BU aware activations.

Consent Best Practices

We recommend that you follow these practices to manage consent when using unified Data Cloud audiences in Journey Builder.

  • Apply Filter Contacts Criteria. When using Journey Builder, consider applying the Filter Contacts criteria to limit who’s included in the audience when creating the entry source.
  • Assign a Publication or Suppression List. When configuring an email activity to use on a Data Cloud audience, consider assigning a Publication or Suppression list for more consent management options.
  • Select this Send To Delivery Option
    • When you configure an SMS activity in Journey Builder to use on a Data Cloud audience, select Send only to contacts who are subscribed currently in the Delivery Options configuration step.
    • This selection helps prevent sending to individuals who didn’t opt in to receive the communication.
  • Use a Multi-Step Journey
    • For push messaging using Data Cloud audiences, use a multi-step journey. You can configure either a Push Notification, Inbox, or In-App Message activity to activate a MobilePush send.
    • Consent for push messaging is managed in the MobilePush SDK (Software Development Kit).

Troubleshooting

[image]

For Marketing Cloud Activations, Data Cloud enforces the presence of contact points (email address, phone number).

Data Cloud Activations

  • The primary key of the curated Data Model Object (DMO) is the same field as the primary key of the Activate On entity.
  • When a segment is refreshed multiple times in a curated DMO, the same row is updated if the primary key already exists.
  • Each time you publish a segment to a curated DMO, records are added or updated to the DMO based on the primary key. It doesn’t delete existing records from the DMO. The following attributes persist in the curated DMO:
    • Segment {segmentid}
    • Segment {segmentid} Name
    • Segment Last Processed
  • Use the Last Processed field (date) to retrieve records, applying filter criteria if needed.
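The add-or-update behavior can be sketched as a plain upsert keyed on the primary key (field names are invented for illustration):

```python
# Curated DMO sketch: rows keyed by primary key. Publishing again updates
# matching rows and never deletes existing ones.
curated_dmo = {}

def publish(records, processed_at):
    """Upsert each (primary_key, payload) pair into the curated DMO."""
    for pk, payload in records:
        row = dict(payload)
        row["segment_last_processed"] = processed_at
        curated_dmo[pk] = row  # insert, or overwrite on an existing primary key

publish([("a", {"segment_name": "VIP"}), ("b", {"segment_name": "VIP"})], "2024-05-01")
publish([("a", {"segment_name": "VIP"})], "2024-06-01")  # refresh: 'b' is retained

print(len(curated_dmo), curated_dmo["a"]["segment_last_processed"])  # 2 2024-06-01
```

After the second publish, 'a' carries the newer Last Processed date while 'b' keeps its original row, which is why filtering on the Last Processed field is the reliable way to pull only freshly published members.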

Amazon S3 Activations

  • After you create and activate segments to Cloud File Storage, a subfolder called Salesforce-c360-Segments is automatically created.
  • Segments in Amazon S3 are created in YYYY/MM/DD/HH/{first 100 characters of segment name}/{20 characters of activation name}_{timestamp in yyyyMMddHHmmsssSSS format} folder.
  • Within the folder, you’ll find two files:
    • A segment_metadata JSON file (segment_metadata.json), which includes metadata about the segment, such as the segment name and record count.
    • A CSV file of everyone in the segment.
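Reading such a drop might look like the following sketch; the metadata field names here are assumptions, since the source describes only the file roles:

```python
import csv
import io
import json

# Hypothetical contents of the two files in one activation folder
metadata = json.loads('{"segmentName": "High Value Customers", "recordCount": 2}')
members = list(csv.DictReader(io.StringIO("email\na@example.com\nb@example.com\n")))

# A simple sanity check: the metadata's record count should match the CSV
print(metadata["recordCount"] == len(members))  # True
```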

For Amazon S3 activations, Data Cloud does not enforce the presence of contact points, like email addresses or phone numbers.

Consuming Data Cloud Data in Core CRM

Data Cloud data has the potential to support many use cases across the Salesforce Platform.

  • Search - Search for a Profile using Contact Point information (e.g., email, phone).
  • Display - Surface details about a specific Unified Profile related to an Individual.
  • Engagement - Show depth of activity through aggregation or summary for one Unified Profile (e.g., Case History).