Orchestration with Lakeflow Jobs
Overview
| Experience | In Person |
|---|---|
| Type | Breakout |
| Track | Data Engineering and Streaming |
| Industry | Enterprise Technology |
| Technologies | Databricks Workflows, Lakeflow |
| Skill Level | Beginner |
Curious about orchestrating data pipelines on Databricks? Join us for an introduction to Lakeflow Jobs (formerly Databricks Workflows), an easy-to-use orchestration service built into the Databricks Data Intelligence Platform. Lakeflow Jobs simplifies automating your data and AI workflows, from ETL pipelines to machine learning model training. In this beginner-friendly session, you'll learn how to:

- Build and manage pipelines using a visual approach
- Monitor workflows and rerun failures with repair runs
- Automate tasks like publishing dashboards or ingesting data using Lakeflow Connect
- Add smart triggers that respond to new files or table updates
- Use built-in loops and conditions to reduce manual work and make workflows more dynamic

We'll walk through common use cases, share demos, and offer tips to help you get started quickly. If you're new to orchestration or just getting started with Databricks, this session is for you.
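The triggers and conditional tasks mentioned above correspond to settings in the Databricks Jobs API. As a rough sketch (job name, notebook paths, the storage location, and the `row_count` task value are all placeholders, not details from the session), a job with a file-arrival trigger and a condition task might look something like this:

```json
{
  "name": "example-ingest-job",
  "trigger": {
    "file_arrival": { "url": "/Volumes/main/raw/landing/" }
  },
  "tasks": [
    {
      "task_key": "ingest",
      "notebook_task": { "notebook_path": "/Workspace/ingest" }
    },
    {
      "task_key": "check_rows",
      "depends_on": [ { "task_key": "ingest" } ],
      "condition_task": {
        "left": "{{tasks.ingest.values.row_count}}",
        "op": "GREATER_THAN",
        "right": "0"
      }
    },
    {
      "task_key": "publish_dashboard",
      "depends_on": [ { "task_key": "check_rows", "outcome": "true" } ],
      "notebook_task": { "notebook_path": "/Workspace/publish" }
    }
  ]
}
```

Here the job fires whenever a new file lands in the storage location, and the dashboard-publishing task runs only when the condition task evaluates to true; the session covers building the equivalent pipeline through the visual UI rather than raw API calls.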
Session Speakers
Saad Ansari
Product Management
Databricks
Anthony Podgorsak
Product Manager
Databricks