Integrating SAP data into your Databricks Lakehouse
Enterprises are increasingly turning to a Lakehouse architecture for data management. This architecture combines the flexibility and near-infinite scalability of the cloud with the governance and reliability of traditional data warehouses.
The Databricks Lakehouse is built on Delta Lake, an open source storage layer that brings reliability to data lakes with ACID transactions, scalable metadata handling, and unified streaming and batch data processing. It enables companies to cost-effectively unlock all their SAP ERP data for analytical purposes and to combine it with live data sources such as IoT devices.
In this webinar you will learn how to:
- Extract and combine SAP master and transactional data into one Common Data Model (CDM), consumable by Databricks and other cloud services
- Reuse existing business logic by utilising SAP extractors via the ODP framework to perform both full and delta replication
- Transform the data in Databricks for different quality levels of a Lakehouse architecture: from ingested, to qualified and curated
- Utilise a SQL Analytics Workspace to compute aggregations using SQL and publish them to a dashboard
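The full and delta replication pattern above can be sketched in plain Python. This is only a conceptual illustration: the field names `doc_id` and `change_mode` are assumptions, not actual ODP structures, and on Databricks the same upsert would typically be expressed as a Delta Lake `MERGE INTO` rather than a Python loop.

```python
# Conceptual sketch: merging a delta replication into a previous full load.
# Field names (doc_id, change_mode) are illustrative assumptions only.

def apply_delta(snapshot: dict, delta_records: list) -> dict:
    """Upsert or delete rows in the snapshot, keyed by 'doc_id'."""
    result = dict(snapshot)
    for rec in delta_records:
        key = rec["doc_id"]
        if rec.get("change_mode") == "D":   # deletion image: drop the row
            result.pop(key, None)
        else:                               # insert or after-image update
            result[key] = {k: v for k, v in rec.items() if k != "change_mode"}
    return result

# Full replication: initial snapshot of two sales documents
full_load = {
    "0001": {"doc_id": "0001", "amount": 100},
    "0002": {"doc_id": "0002", "amount": 250},
}

# Delta replication: one update, one deletion, one new document
delta = [
    {"doc_id": "0001", "amount": 120, "change_mode": "U"},
    {"doc_id": "0002", "change_mode": "D"},
    {"doc_id": "0003", "amount": 75, "change_mode": "I"},
]

merged = apply_delta(full_load, delta)
print(sorted(merged))  # → ['0001', '0003']
```

The point of the sketch is that a delta replication only ships change records, so the consumer needs a key and a change indicator to reconcile them with the last full load — exactly what the ODP framework's extractors provide.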