From 10 Hours to 10 Minutes: Unleashing the Power of DLT
Overview
| Experience | In Person |
|---|---|
| Type | Breakout |
| Track | Data Engineering and Streaming |
| Industry | Manufacturing |
| Technologies | Delta Lake, Databricks SQL, DLT |
| Skill Level | Intermediate |
| Duration | 40 min |
How do you transform a data pipeline from sluggish 10-hour batch processing into a real-time powerhouse that delivers insights in just 10 minutes? This was the challenge we tackled at one of France's largest manufacturing companies, where data integration and analytics were mission-critical for supply chain optimization.
Power BI dashboards needed to refresh every 15 minutes, but our legacy Azure Data Factory batch pipelines couldn't keep up, delaying insights and generating up to three incident tickets a day.
We identified DLT and Databricks SQL as the game-changing solution to modernize our workflow, implement quality checks and reduce processing times. In this session, we'll dive into the key factors behind our success:
- Pipeline modernization with DLT: improving scalability (see the sketch after this list)
- Data quality enforcement: delivering clean, reliable datasets
- Seamless BI integration: using Databricks SQL to power fast, efficient queries in Power BI
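To make the DLT and data-quality points concrete, here is a minimal sketch of what such a pipeline can look like in Python. The table names, the landing path and the expectation rules are illustrative assumptions, not the actual pipeline built for this project.

```python
# Minimal DLT sketch: incremental ingestion plus declarative quality checks.
# Table names, the source path and the expectation rules are hypothetical.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw supply chain events ingested incrementally with Auto Loader.")
def orders_bronze():
    # 'spark' is provided by the DLT runtime; 'cloudFiles' is Auto Loader.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/raw/orders")  # hypothetical landing path
    )


@dlt.table(comment="Cleaned orders used by downstream Power BI dashboards.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
@dlt.expect("non_negative_quantity", "quantity >= 0")
def orders_silver():
    # Rows failing 'valid_order_id' are dropped; 'non_negative_quantity'
    # violations are recorded in the pipeline's data quality metrics.
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("ingested_at", F.current_timestamp())
    )
```

Tables published this way can then be queried from a Databricks SQL warehouse, which is what allows Power BI to refresh against curated data instead of waiting on a long batch job.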
Session Speakers
Yash Joshi
Senior Data Engineer, Accenture
Fatima CHIKH
Data Engineer, Accenture