Salesforce and Databricks are excited to announce an expanded strategic partnership that delivers a powerful new integration: Salesforce Bring Your Own Model (BYOM) for Databricks. This collaboration enables data scientists and machine learning engineers to leverage the best of both worlds: the robust customer data and business capabilities of Salesforce and the advanced analytics and AI capabilities of Databricks. With this integration, you can build, train, and deploy custom AI models in Databricks and integrate them into Salesforce to deliver intelligent, personalized customer experiences. Get ready to unlock the full potential of your data and revolutionize the way you engage with your customers.
Salesforce Data Cloud enables teams to collect data from diverse sources and expose it to downstream Salesforce CRM applications and workflows. Users can quickly and easily configure data objects to pull data from other Salesforce applications or external sources and map it to built-in or custom data models as needed.
Once the data is loaded and mapped, the Python SDK connector can be used to access Salesforce Data Model Objects (DMOs) from within Databricks. Data from these objects can be combined with other data in your Databricks Lakehouse to power exploratory analysis and featurization that prepare the data for model training, using all the tools your data scientists and engineers know and love from the Databricks platform, including Notebooks, Feature Engineering, AutoML, and MLflow.
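To make that first step concrete, here is a minimal sketch of pulling a DMO into a Databricks notebook with the open source salesforce-cdp-connector package. The import path and constructor arguments follow that package's documentation at the time of writing and may differ by version, and the credentials, DMO name, and column names below are illustrative placeholders rather than part of any real org.

```python
# A minimal sketch: query a Data Cloud DMO from a Databricks notebook.
# Assumes the salesforce-cdp-connector package; connection details, the DMO
# name, and the selected columns are illustrative placeholders.
from salesforcecdpconnector.connection import SalesforceCDPConnection

conn = SalesforceCDPConnection(
    "https://login.salesforce.com",   # login URL for your org
    "<USERNAME>",                     # Data Cloud user
    "<PASSWORD>",
    "<CONNECTED_APP_CLIENT_ID>",
    "<CONNECTED_APP_CLIENT_SECRET>",
)

# Pull the DMO into a pandas DataFrame via the connector's SQL interface
pdf = conn.get_pandas_dataframe(
    "SELECT Id__c, Email__c, LifetimeSpend__c FROM UnifiedIndividual__dlm LIMIT 1000"
)

# Convert to a Spark DataFrame (spark is the notebook's built-in SparkSession)
# so it can be joined with other Lakehouse tables for featurization
df = spark.createDataFrame(pdf)
display(df)
```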
Once the model is trained and registered in Unity Catalog, your machine learning engineers can quickly and easily spin up an endpoint with Databricks Model Serving. This endpoint can then be made available in Salesforce Data Cloud via the new integration: navigate to Einstein 1 Studio's Model Builder and register the endpoint as a new Databricks model. That's it! Now your model is ready to use in any of your Salesforce applications and workflows.
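As a rough illustration of the Databricks side of this step, the sketch below registers a trained model in Unity Catalog with MLflow and creates a Model Serving endpoint with the Databricks Python SDK. The catalog, schema, model, and endpoint names are placeholders, and `trained_model` stands in for a model fitted earlier in the notebook; this is not the accelerator's exact code.

```python
# A minimal sketch: register a trained model in Unity Catalog and expose it
# through a Databricks Model Serving endpoint. Names below are placeholders.
import mlflow
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import EndpointCoreConfigInput, ServedEntityInput

# Point the MLflow registry at Unity Catalog and log the model under a
# three-level name (catalog.schema.model)
mlflow.set_registry_uri("databricks-uc")
model_name = "main.ml_models.product_preference"  # placeholder
with mlflow.start_run():
    mlflow.sklearn.log_model(
        sk_model=trained_model,          # model fitted earlier in the notebook
        artifact_path="model",
        registered_model_name=model_name,
    )

# Create a serving endpoint that Model Builder can later register in Salesforce
w = WorkspaceClient()
w.serving_endpoints.create(
    name="product-preference-endpoint",  # placeholder endpoint name
    config=EndpointCoreConfigInput(
        served_entities=[
            ServedEntityInput(
                entity_name=model_name,
                entity_version="1",
                workload_size="Small",
                scale_to_zero_enabled=True,
            )
        ]
    ),
)
```

With the endpoint running, the remaining work happens in the Einstein 1 Studio Model Builder UI rather than in code.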
This powerful integration gives customers the ability to solve a broad variety of problems more quickly and easily than ever before.
These three core capabilities (ingesting and mapping data in Data Cloud, training and serving models in Databricks, and registering those endpoints in Model Builder) work together to give your data teams superpowers. When data and application teams can quickly and easily collaborate using the best parts of each platform, the result is greater business agility and amplified business impact through AI-powered optimization across the entire business.
Given the breadth of Salesforce applications and their ability to power sales, service, marketing, and commerce, the number of use cases you can imagine is practically limitless. Likewise, because the Databricks Lakehouse can manage data and AI assets across all of those functions, and across the rest of your business as well, those use cases can draw on the full breadth of your data and AI capabilities to transform your business for the AI economy. Here are a few examples:
To help get your models into production faster, please check out our Salesforce Databricks BYOM solution accelerator. The solution accelerator is a series of notebooks and instructions that walk through all of the steps needed to build a robust ML pipeline integrating data from Salesforce Data Cloud and the Databricks Lakehouse, and then deploy the resulting model for integration into business processes in any of your Salesforce applications. The example is based on a fictional company, Northern Trail Outfitters, that wants to predict customers' product preferences in order to deliver personalized recommendations. It uses Customer 360 data synthesized and stored in Data Cloud to drive the model building process in Databricks, and then shows how to reference that model from Salesforce.
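To give a feel for what that model building step could look like (this is a sketch under assumed names, not code taken from the accelerator), a Databricks AutoML classification run over a Customer 360 feature table might be kicked off like this:

```python
# A minimal sketch: use Databricks AutoML to train a classifier that predicts
# a customer's preferred product category. The table name and label column
# are illustrative placeholders, not the accelerator's actual schema.
from databricks import automl

training_df = spark.table("main.northern_trail.customer_360_features")  # placeholder

summary = automl.classify(
    dataset=training_df,
    target_col="preferred_product_category",  # placeholder label column
    timeout_minutes=30,
)

# The best run's model can then be registered in Unity Catalog and served
# as shown earlier, ready to be registered in Model Builder
print(summary.best_trial.model_path)
```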
These are just a few examples of use cases that can help your business deliver value with data and AI. Get started today with our new BYOM solution accelerator!