Using AI Model Output in Data Cloud with Prediction Jobs

Salesforce Data Cloud puts AI into the hands of developers with easy-to-use tools. In an earlier blog post, How to Build a Predictive AI Model in Data Cloud, we described how a fictitious rescue center can use historical data to predict the likelihood of new animals being adopted. This blog post describes the steps needed to operationalize the predictive AI model to drive smarter business decisions.

The use case

In this post, we’ll use the predictive model created in the earlier blog post and build a prediction job that generates a predictive outcome for each record in the source data that we specify. The rescue center can then use the output to drive business outcomes in marketing campaigns and workflows, and give applications easy access to the results using the Query API.

Creating the prediction job

Log in to Data Cloud and navigate to Einstein Studio
Open the Predicted Animal Adoption AI model, then click Integrations
Click New Prediction Job

Here, we can select our source data model object (DMO) with the information for new animals that need a prediction score generated. In our case, the rescue center ingested data corresponding to a new intake of animals, and the DMO is called Animal Nov 2024. This source object contains columns corresponding to those used to train the AI model.
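
Before mapping fields, it can help to sanity-check the source data in the Query Editor. The query below is a minimal sketch; the column names are the ones used later in this post and may differ in your own data model.

SELECT
    Name__c,
    Type_c__c,
    Breed_c__c,
    Gender_c__c,
    Primary_Colour_c__c
FROM Animal_Nov_2024__dlm
LIMIT 10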

Select the Animal Nov 2024 data model object

Click Next
Select the columns in the source data model object that map to each required input

Click Next
Now select how the prediction job runs. For this example, we’ll select Streaming

This will update the prediction whenever data in the input fields changes. Note that you can also run the prediction job manually by selecting Batch.

You can optionally select which fields will update your prediction by selecting the checkbox next to each field.

Click Next
Review the summary and click Next
Give the prediction job a name and description

Prediction Job Name: Adoption Prediction Output
Prediction Description: Predictions for likelihood of adoption
Data Model Object Name: Adoption_Prediction_Output

Click Create

Your prediction job has now been created, and we’re ready to activate it.

Activating the prediction job

By default, your new prediction job is inactive. Before the job can run, it needs to be activated.

Navigate to the Integration tab and Activate the prediction job

The activation process takes some time, so wait until the status of the prediction job updates to Active.

Now, the prediction job is ready to be run.

Run the prediction job

Running the prediction job will take records from the source data model object and run each one through the predictive AI model to produce a score. The results of the prediction will be stored in a new data model object using the name provided when the job was created. In our case, the DMO is called Adoption_Prediction_Output__dlm.

Navigate to the Integration tab and Run the prediction job

The job may take some time to process, so wait until the Last Run Status field is updated to Success.
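
Once the Last Run Status shows Success, a quick count in the Query Editor (covered in more detail below) can confirm that predictions were generated. This is a minimal sketch that assumes the output DMO name chosen above:

SELECT COUNT(*) AS Scored_Records
FROM Adoption_Prediction_Output__dlm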

Review the output

Now that we have successfully run the prediction job, we’re ready to review the output.

Navigate to Data Explorer
In the Object dropdown, select Data Model Object → Adoption_Prediction_Output

Here, we can see that each Animal record in the source data now has a prediction assigned, expressed as a percentage. The higher the value, the more likely the animal is to be adopted. Since the output is a data model object, we can join it with the parent source to view details about each animal.

Navigate to Query Editor
In the tab that appears, add the following SQL query, which joins the parent DMO (Animal_Nov_2024__dlm) with the results of the prediction (Adoption_Prediction_Output__dlm)

SELECT
    animal.Name__c AS ID,
    animal.Type_c__c AS Type,
    animal.Breed_c__c AS Breed,
    animal.Gender_c__c AS Gender,
    animal.Primary_Colour_c__c AS Color,
    prediction.Adopted_c__c AS Prediction
FROM Animal_Nov_2024__dlm AS animal
JOIN Adoption_Prediction_Output__dlm AS prediction
    ON prediction.PrimaryObjectPk__c = animal.Name__c

Click Run Query

Running this SQL query lets us see the animal details alongside the prediction, clearly showing which animals have a higher or lower chance of adoption.
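
If you want to surface the most at-risk animals first, the same join can be sorted by the prediction value. The following is a sketch based on the query above; it assumes Adopted_c__c holds the numeric prediction, with lower values meaning a lower chance of adoption.

SELECT
    animal.Name__c AS ID,
    animal.Breed_c__c AS Breed,
    prediction.Adopted_c__c AS Prediction
FROM Animal_Nov_2024__dlm AS animal
JOIN Adoption_Prediction_Output__dlm AS prediction
    ON prediction.PrimaryObjectPk__c = animal.Name__c
ORDER BY prediction.Adopted_c__c ASC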

Take action on predictions

Streaming prediction jobs come into their own when an organization needs to take action on predictive values as they change. For our rescue center use case, this means that the center’s staff can react in real time when animals at risk of not being adopted are brought into the center.

As data is added or updated in the source DMO, the streaming prediction job re-scores each animal and creates a new prediction of its likelihood of adoption. We can then use those data changes to trigger a business process in Salesforce.

To do that, we can create a Data Cloud triggered flow.

Create a flow

Open Setup in Salesforce and click Flows, then click New Flow
In the dialog that opens, select Start from Scratch, click Next, select Data Cloud-Triggered Flow, and click Create

Configuring the start element

Select Adoption_Prediction_Output as the object whose records trigger the flow, and set it to start when a record is created or updated.

Create a decision based on the prediction

Now, we can create a decision node that uses the predictive output to look for animals at risk. In our use case, we want to take action when the predicted chance of adoption falls below 20%. Details of these animals need to be sent to a specialized team on Slack, who can identify the best course of action to ensure that they find a forever home as fast as possible.
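
Before wiring up the decision in Flow Builder, it can be useful to check how many records would currently fall below the threshold. The sketch below assumes the prediction is stored as a percentage (0–100) in Adopted_c__c, as seen in Data Explorer; if your output is stored as a fraction, use 0.2 instead.

SELECT COUNT(*) AS At_Risk_Animals
FROM Adoption_Prediction_Output__dlm
WHERE Adopted_c__c < 20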

Surface insights for collaboration

In this post, we’re using Slack to surface key insights. To set up Salesforce for Slack integrations, you can follow the documentation. This is a prerequisite to being able to use Slack actions in Flow Builder.

Here, we’ve set up a business process that retrieves the animal details from Salesforce, and then uses the details to send a message to Slack. For full details on the steps needed, check out the How to use Data Cloud AI Model Predictions in Flow blog post.

Conclusion

Salesforce Data Cloud offers a versatile platform for operationalizing predictive AI models, empowering businesses to extract valuable insights from their data.

In this blog post, we demonstrated how a rescue center can leverage AI prediction jobs to predict the likelihood of animal adoption. This data can be used in many ways to drive business benefits:

Data Cloud triggered flows: Trigger actions in Salesforce when data changing in your enterprise causes predictive outcomes to change
Targeted marketing: Create personalized marketing campaigns for animals with a lower adoption probability, reaching out to registered volunteers and website members
Batch data transforms: Incorporate predictive output into data transformations to filter and aggregate relevant records (see the example query after this list)
Data graphs: Combine normalized table output from your predictive models to create new, materialized views of your data and make your predictions available in a single, flattened, read-only data graph record
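
As an example of the batch data transform idea above, the following sketch aggregates the prediction output by breed, reusing the column names from earlier in this post; a query like this could feed a transform or a dashboard that highlights groups with a lower average adoption likelihood.

SELECT
    animal.Breed_c__c AS Breed,
    COUNT(*) AS Animals,
    AVG(prediction.Adopted_c__c) AS Avg_Prediction
FROM Animal_Nov_2024__dlm AS animal
JOIN Adoption_Prediction_Output__dlm AS prediction
    ON prediction.PrimaryObjectPk__c = animal.Name__c
GROUP BY animal.Breed_c__c
ORDER BY AVG(prediction.Adopted_c__c) ASC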

Resources

Documentation: Use Jobs to Get Predictions
Documentation: Unlock the Power of AI with Einstein Studio
Trailhead: Build AI Models in Einstein Studio

About the author

Dave Norris is a Developer Advocate at Salesforce. He’s passionate about making technical subjects broadly accessible to a diverse audience. Dave has been with Salesforce for over a decade, has over 35 Salesforce and MuleSoft certifications, and became a Salesforce Certified Technical Architect in 2013.
