Migrating Legacy SAS Code to Databricks Lakehouse: What We Learned Along the Way
Overview
| Experience | In Person |
|---|---|
| Type | Breakout |
| Track | Data Warehousing |
| Industry | Financial Services |
| Technologies | Apache Spark, Delta Lake, Unity Catalog |
| Skill Level | Intermediate |
| Duration | 40 min |
At PacificSource Health Plans, a US health insurance company, we are on a successful multi-year journey to migrate our entire data and analytics ecosystem to a Databricks lakehouse enterprise data warehouse.
A particular obstacle on this journey was a reporting data mart built on copious amounts of legacy SAS code that applied sophisticated business logic transformations for membership, claims, premiums, and reserves. This core data mart drove many of our critical reports and analytics.
In this session we will share the unique and somewhat unexpected challenges and complexities we encountered in migrating this legacy SAS code; how our partner (T1A) leveraged automation technology (Alchemist) and some unique approaches to reverse engineer (analyze), instrument, translate, migrate, validate, and reconcile these jobs; and what lessons we learned and carried forward from this migration effort.
Session Speakers
Dmitriy Alergant
Principal Architect
Tier One Analytics Inc.
Matt Adams
Senior Data Platforms Developer
PacificSource Health Plans