Session

Breaking Barriers: Building Custom Spark 4.0 Data Connectors with Python

Overview

Experience: In Person
Type: Breakout
Track: Data Engineering and Streaming
Industry: Enterprise Technology, Professional Services, Financial Services
Technologies: Apache Spark, Unity Catalog
Skill Level: Intermediate
Duration: 40 min

Building a custom Spark data source connector once required Java or Scala expertise, making it complex and limiting. As a result, many proprietary data sources without public SDKs remained disconnected from Spark, and data sources that offered only Python SDKs could not take advantage of Spark's distributed processing power.

Spark 4.0 changes this with a new Python API for data source connectors, allowing developers to build fully functional connectors without Java or Scala. This unlocks new possibilities, from integrating proprietary systems to leveraging untapped data sources. Supporting both batch and streaming, this API makes data ingestion more flexible than ever.

In this talk, we’ll demonstrate how to build a Spark connector for Excel using Python, showcasing schema inference, data reads/writes and streaming support. Whether you're a data engineer or Spark enthusiast, you’ll gain the knowledge to integrate Spark with any data source — entirely in Python.
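To give a flavor of what the session covers, below is a minimal sketch of a batch read connector built with the Spark 4.0 Python Data Source API (the pyspark.sql.datasource module). The "excel_like" source name, the hard-coded rows, and the "path" option are illustrative placeholders, not the Excel connector demonstrated in the talk.

```python
from pyspark.sql import SparkSession
from pyspark.sql.datasource import DataSource, DataSourceReader


class ExcelLikeDataSource(DataSource):
    """Illustrative connector; the real Excel connector shown in the talk will differ."""

    @classmethod
    def name(cls):
        # Short name used with spark.read.format(...)
        return "excel_like"

    def schema(self):
        # Default schema (DDL string) used when the caller does not supply one
        return "sheet string, row_id int, value string"

    def reader(self, schema):
        # self.options carries the options passed via .option()/.options()
        return ExcelLikeReader(schema, self.options)


class ExcelLikeReader(DataSourceReader):
    def __init__(self, schema, options):
        self.schema = schema
        self.path = options.get("path")  # hypothetical option, for illustration only

    def read(self, partition):
        # Yield plain tuples matching the schema; a real reader would parse the file at self.path
        yield ("Sheet1", 1, "hello")
        yield ("Sheet1", 2, "world")


spark = SparkSession.builder.getOrCreate()

# Register the connector and read from it, entirely in Python
spark.dataSource.register(ExcelLikeDataSource)
df = spark.read.format("excel_like").option("path", "/tmp/example.xlsx").load()
df.show()
```

Streaming support, which the abstract mentions, follows the same pattern: the DataSource class can also return a stream reader (for example via streamReader) implementing the streaming read interface from the same module.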

Session Speakers

Sourav Gulati

Senior Resident Solutions Architect
Databricks

Ashish Saraswat

Resident Solutions Architect
Databricks