How to Build a Lakehouse

Every business today wants to use data to drive strategic initiatives with machine learning, data science and analytics, but runs into challenges from siloed teams, proprietary technologies and unreliable data.

That’s why enterprises are turning to the lakehouse: a single platform that unifies all your data, analytics and AI workloads.

Join our How to Build a Lakehouse technical training, where we’ll explore how to use Apache Spark™, Delta Lake and other open source technologies to build a better lakehouse. This virtual session will include concepts, architectures and demos.

Here’s what you’ll learn in this 2-hour session:

  • How Delta Lake combines the best of data warehouses and data lakes for improved data reliability, performance and security
  • How to use Apache Spark and Delta Lake to perform ETL processing, manage late-arriving data, and repair corrupted data directly on your lakehouse

Speakers


Doug Bateman
Master Instructor, Databricks

Watch On-demand