Session
Lakeflow Observability: From UI Monitoring to Deep Analytics
Overview
| Experience | In Person |
|---|---|
| Type | Breakout |
| Track | Data Engineering and Streaming |
| Industry | Enterprise Technology |
| Technologies | Databricks Workflows, DLT, LakeFlow |
| Skill Level | Intermediate |
Monitoring data pipelines is key to reliability at scale. In this session, we'll dive into the observability experience in Lakeflow, Databricks' unified data engineering solution, from intuitive UI monitoring to advanced event analysis, cost observability, and custom dashboards.

We'll walk through the revamped UX for Lakeflow observability, showing how to:

- Monitor runs, task states, dependencies, and retry behavior in the UI
- Set up alerts for job and pipeline outcomes and failures
- Use pipeline and job system tables for historical insights
- Explore run events and event logs for root cause analysis
- Analyze metadata to understand and optimize pipeline spend
- Build custom dashboards using system tables to track performance, data quality, freshness, SLAs, and failure trends, and drive automated alerting based on real-time signals

Whether you're new to data engineering or building a platform-wide monitoring solution, this session will help you unlock full visibility into your data workflows.
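To give a flavor of the system-table analysis described above, here is a minimal sketch, assuming a Databricks notebook environment (where `spark` and `display` are predefined) with system tables enabled; the `system.lakeflow.job_run_timeline` table and its columns reflect the current Databricks system-tables schema and are not taken from the session materials.

```python
# Minimal sketch: summarize job failure trends over the last 7 days
# using the Lakeflow job system tables. Assumes a Databricks notebook
# (where `spark` and `display` exist) and that system tables are enabled;
# table and column names may differ in your workspace.
failure_trend = spark.sql("""
    SELECT
        date_trunc('DAY', period_end_time)   AS run_date,
        job_id,
        count_if(result_state = 'FAILED')    AS failed_runs,
        count(*)                             AS total_runs
    FROM system.lakeflow.job_run_timeline
    WHERE period_end_time >= current_date() - INTERVAL 7 DAYS
    GROUP BY 1, 2
    ORDER BY failed_runs DESC
""")

display(failure_trend)
```

A dashboard tile or SQL alert built on a query like this is one way to track the failure-trend and automated-alerting use cases mentioned in the abstract.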
Session Speakers
Saad Ansari
Product Management
Databricks
Theresa Hammer
Associate Product Manager
Databricks