Fundamentals of Delta Lake
Explore the fundamental concepts behind Delta Lake.
Today, many organizations struggle to deliver successful big data and artificial intelligence (AI) projects. One of the biggest challenges they face is ensuring that quality, reliable data is available to the data practitioners running these projects. After all, an organization without reliable data will not succeed with AI. To help organizations bring structure, reliability, and performance to their data lakes, Databricks created Delta Lake.
Delta Lake is an open format storage layer that sits on top of your organization’s data lake. It is the foundation of a cost-effective, highly scalable Lakehouse and is an integral part of the Databricks Lakehouse Platform.
In this course, we’ll break down the basics of Delta Lake: what it does, how it works, and why it is valuable, from a business perspective, to any organization with big data and AI projects.
Note: This is an introductory-level course that will *not* showcase in-depth technical Delta Lake demos or provide hands-on technical training with Delta Lake. For technical training on Delta Lake, please see the Delta Lake Rapidstart courses available in the Databricks Academy.
Upon completing this course, you will be able to:
- Describe how Delta Lake fits into the Databricks Lakehouse Platform.
- Explain the four elements encompassed by Delta Lake.
- Summarize high-level Delta Lake functionality that helps organizations solve common challenges related to enterprise-scale data analytics.
- Articulate examples of how organizations have employed Delta Lake on Databricks to improve business outcomes.
Beginning knowledge of the Databricks Lakehouse Platform is expected. We recommend taking the course Fundamentals of the Databricks Lakehouse Platform before taking this course.
- This course is part of all Databricks Academy learning paths.
Proof of completion
- Once you complete 80% of this course, you will receive a proof of completion.