
Build Data Pipelines with Delta Live Tables

In this course, you’ll learn how to define and schedule data pipelines that incrementally ingest and process data through multiple tables in the lakehouse using Delta Live Tables (DLT) in Spark SQL and Python. The course covers:

  • Getting started with DLT
  • How DLT tracks data dependencies in data pipelines
  • Configuring and running data pipelines with the Delta Live Tables UI
  • Defining pipelines in Python or Spark SQL that ingest and process data through multiple tables in the lakehouse using Auto Loader and DLT
  • Processing Change Data Capture feeds with APPLY CHANGES INTO syntax
  • Reviewing event logs and data artifacts created by pipelines, and troubleshooting DLT syntax
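
For orientation, here is a minimal sketch of the kind of pipeline the course builds, written against the DLT Python API. The table names, source path, and column names (orders_bronze, customer_id, and so on) are illustrative assumptions, not course materials:

```python
# Minimal DLT pipeline sketch: Auto Loader ingestion, a dependent table,
# and a CDC feed. Paths, table names, and columns are hypothetical;
# `spark` is provided by the DLT runtime.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Bronze: raw orders ingested incrementally with Auto Loader")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")    # Auto Loader source
        .option("cloudFiles.format", "json")
        .load("/Volumes/raw/orders")              # hypothetical landing path
    )

@dlt.table(comment="Silver: orders with a basic quality filter")
def orders_silver():
    # Reading another pipeline table declares a dependency, which DLT
    # uses to build and order the pipeline's execution graph
    return dlt.read_stream("orders_bronze").where(col("order_id").isNotNull())

# Process a CDC feed into a target table; this is the Python counterpart of
# the APPLY CHANGES INTO SQL syntax. "customers_bronze" is assumed to be
# another table defined elsewhere in the pipeline.
dlt.create_streaming_table("customers_silver")
dlt.apply_changes(
    target="customers_silver",
    source="customers_bronze",
    keys=["customer_id"],
    sequence_by=col("sequence_num"),
)
```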

Skill Level: Associate
Duration: 4h
Prerequisites
  • Beginner-level familiarity with basic cloud concepts (virtual machines, object storage, identity management)
  • Ability to perform basic code development tasks (create compute, run code in notebooks, perform basic notebook operations, import repos from Git, etc.)
  • Intermediate familiarity with basic SQL concepts (CREATE, SELECT, INSERT, UPDATE, DELETE, WHERE, GROUP BY, JOIN, etc.)

Outline

    • The Medallion Architecture
    • Introduction to Delta Live Tables
    • Using the Delta Live Tables UI - PART 1 - Orders
    • Using the Delta Live Tables UI - PART 2 - Customers
    • Using the Delta Live Tables UI - PART 3 - Lab - Status
    • SQL pipelines
    • Python pipelines
    • Delta Live Tables Running Modes
    • Pipeline Results
    • Pipeline Event Logs

Upcoming Public Classes

Date    Time                              Language  Price
Mar 31  01 PM - 05 PM (Europe/London)     English   $750.00
Mar 31  09 AM - 01 PM (America/New_York)  Spanish   $750.00
Apr 03  09 AM - 01 PM (America/New_York)  English   $750.00
Apr 04  09 AM - 01 PM (Asia/Kolkata)      English   $750.00
Apr 28  01 PM - 05 PM (Asia/Kolkata)      English   $750.00
Apr 30  01 PM - 05 PM (America/New_York)  English   $750.00
May 02  09 AM - 01 PM (Europe/London)     English   $750.00
Jun 02  09 AM - 01 PM (America/New_York)  English   $750.00
Jun 03  01 PM - 05 PM (Europe/London)     English   $750.00
Jun 06  09 AM - 01 PM (Asia/Kolkata)      English   $750.00
Jun 30  01 PM - 05 PM (Asia/Kolkata)      English   $750.00
Jul 01  01 PM - 05 PM (America/New_York)  English   $750.00
Jul 02  09 AM - 01 PM (Europe/London)     English   $750.00
Jul 02  01 PM - 05 PM (Europe/London)     English   $750.00
Jul 03  01 PM - 05 PM (Asia/Kolkata)      English   $750.00

Public Class Registration

If your company has purchased success credits or has a learning subscription, please fill out the Training Request form. Otherwise, you can register below.

Private Class Request

If your company is interested in private training, please submit a request.

Registration options

Databricks has a delivery method for wherever you are on your learning journey

Self-Paced

Custom-fit learning paths for data, analytics, and AI roles, delivered through on-demand videos

Register now

Instructor-Led

Public and private classes taught by expert instructors, ranging from half-day to two-day courses

Register now

Blended Learning

Self-paced and weekly instructor-led sessions for every style of learner to optimize course completion and knowledge retention. Go to the Subscriptions Catalog tab to purchase

Purchase now

Skills@Scale

Comprehensive training offering for large-scale customers that includes learning elements for every learning style. Contact your account executive for details

Upcoming Public Classes

Data Engineer

Automated Deployment with Databricks Asset Bundles

This course provides a comprehensive review of DevOps principles and their application to Databricks projects. It begins with an overview of core DevOps, DataOps, continuous integration (CI), continuous deployment (CD), and testing, and explores how these principles can be applied to data engineering pipelines.

The course then focuses on continuous deployment within the CI/CD process, examining tools like the Databricks REST API, SDK, and CLI for project deployment. You will learn about Databricks Asset Bundles (DABs) and how they fit into the CI/CD process. You’ll dive into their key components, folder structure, and how they streamline deployment across various target environments in Databricks. You will also learn how to add variables to bundles and how to modify, validate, deploy, and execute them for multiple environments with different configurations using the Databricks CLI.
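
As a small taste of that workflow, the sketch below pairs the core bundle CLI commands with a Databricks SDK check of the result. The bundle target and job key are hypothetical, and authentication is assumed to come from standard environment variables:

```python
# Sketch: verify a bundle deployment with the Databricks SDK.
# Assumptions: the databricks-sdk package is installed, DATABRICKS_HOST and
# DATABRICKS_TOKEN are set, and a bundle was deployed first with the
# Databricks CLI, e.g.:
#   databricks bundle validate
#   databricks bundle deploy -t dev
#   databricks bundle run -t dev my_job   # "my_job" is a hypothetical job key
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # reads workspace host and token from the environment

# List the jobs now present in the workspace to confirm the deployment landed
for job in w.jobs.list():
    print(job.job_id, job.settings.name)
```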

Finally, the course introduces Visual Studio Code as an Integrated Development Environment (IDE) for building, testing, and deploying Databricks Asset Bundles locally, optimizing your development process. The course concludes with an introduction to automating deployment pipelines using GitHub Actions to enhance the CI/CD workflow with Databricks Asset Bundles.

By the end of this course, you will be equipped to automate Databricks project deployments with Databricks Asset Bundles, improving efficiency through DevOps practices.

Paid | 4h | Lab | Instructor-led | Professional

Questions?

If you have any questions, please refer to our Frequently Asked Questions page.