MLOps That Ships: Accelerating AI Deployment at Vizient with Databricks
Overview
Experience | In Person
---|---
Type | Breakout
Track | Artificial Intelligence
Industry | Energy and Utilities, Health and Life Sciences, Financial Services
Technologies | MLflow, AI/BI, Databricks Workflows
Skill Level | Intermediate
Duration | 40 min
Deploying AI models efficiently and consistently is a challenge many organizations face. This session will explore how Vizient built a standardized MLOps stack using Databricks, Azure DevOps and GitHub Actions to streamline model development, deployment and monitoring.
Attendees will gain insight into how Databricks Asset Bundles were used to create reproducible, scalable pipelines and how Infrastructure-as-Code principles accelerated onboarding for new AI projects. The talk will cover:
- End-to-end MLOps stack setup, ensuring efficiency and governance
- CI/CD pipeline architecture, automating model versioning and deployment (a minimal sketch follows this list)
- Standardizing AI model repositories, reducing development and deployment time
- Lessons learned, including challenges and best practices
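To make the model-versioning step concrete, the sketch below logs a trained model with MLflow and registers it as a new version in the model registry, which is the kind of action a CI pipeline typically automates on each merge. The experiment path, model name, and toy training data are hypothetical placeholders, not Vizient's actual setup, and a registry-enabled tracking server (such as Databricks) is assumed.

```python
# Minimal sketch of automated model versioning with MLflow.
# Experiment path, model name, and toy data are illustrative placeholders,
# not Vizient's actual configuration. Assumes a tracking server with a
# model registry (e.g., Databricks) is configured.
import mlflow
from mlflow.models import infer_signature
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Toy stand-in for a real training pipeline.
X, y = make_classification(n_samples=200, n_features=5, random_state=42)
model = RandomForestClassifier(random_state=42).fit(X, y)

mlflow.set_experiment("/Shared/mlops-demo")  # hypothetical experiment path

with mlflow.start_run():
    signature = infer_signature(X, model.predict(X))
    # registered_model_name creates (or appends to) a registry entry,
    # so every CI run produces a new, auditable model version.
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        signature=signature,
        registered_model_name="mlops_demo_model",  # hypothetical name
    )
```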
By the end of this session, participants will have a roadmap for implementing a scalable, reusable MLOps framework that enhances operational efficiency across AI initiatives.
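For the deployment half of such a pipeline, one common pattern (shown here as a hedged sketch, not necessarily the exact approach covered in the session) is a CI/CD gate that smoke-tests a candidate model version and, if it passes, promotes it by moving a registry alias. The model name, version, and alias below are hypothetical.

```python
# Illustrative CI/CD deployment gate: smoke-test a candidate model version,
# then promote it via a registry alias. Model name, version, and alias are
# hypothetical; a registry-enabled tracking server is assumed.
import mlflow
import numpy as np
from mlflow import MlflowClient

MODEL_NAME = "mlops_demo_model"  # hypothetical registry entry
CANDIDATE_VERSION = "1"          # typically injected by the CI/CD pipeline

client = MlflowClient()
candidate = mlflow.pyfunc.load_model(f"models:/{MODEL_NAME}/{CANDIDATE_VERSION}")

# Smoke test: the candidate must score a known-good input without errors.
sample = np.zeros((1, 5))
assert candidate.predict(sample).shape[0] == 1

# Promotion: downstream jobs resolve "models:/<name>@champion",
# so flipping the alias is the deployment step.
client.set_registered_model_alias(MODEL_NAME, "champion", CANDIDATE_VERSION)
```

In a Databricks Asset Bundles setup, a script like this would typically run as a bundle-defined job triggered from GitHub Actions or Azure DevOps after the training job completes.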
Session Speakers
Ram Radhakrishnan
Director - Technology, Data & Analytics
Vizient
Adam Hasham
Lead Machine Learning Engineer
Vizient