Virtual Event Series
Databricks Specialist Sessions

Dive Deeper into Databricks
We’d like to invite you, as an experienced Databricks user, to take a deep dive into a range of topics that will help you maximise the value and performance of the Databricks platform.
Each Databricks Specialist Session takes an in-depth look at a key challenge you may encounter day-to-day, and how to solve it. Each month we will cover a different technical subject, including best practices for governance, geospatial data processing, data warehousing and streaming, as well as a detailed look at the platform architecture.
Upcoming Sessions
- Databricks Apps, 14 October
- Managing Databricks at scale using Terraform, 25 November
Session Overview
14 October 2025, 10:00 AM BST | 11:00 AM CEST
Databricks Apps
This session is an in-depth exploration of application development on Databricks. It guides participants through environment setup and practical techniques for building custom apps with the Databricks Apps framework. Attendees will gain hands-on experience developing applications, with a focus on architectural best practices, robust security, and the latest product advancements relevant to enterprise data environments.
The agenda covers an introduction to Databricks Apps, strategies for effective app development, and insights into platform architecture and security.
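For a flavour of the hands-on portion, here is a minimal sketch of what a Databricks App can look like. Databricks Apps can serve Python web frameworks such as Streamlit; the environment variable names, the demo query, and the connection details below are illustrative assumptions, not materials from the session.

```python
# app.py -- a minimal Databricks App sketch built on Streamlit, one of the
# Python frameworks the Apps runtime can serve. The environment variable
# names and the demo query are illustrative placeholders; substitute the
# connection details for your own workspace and SQL warehouse.
import os

import streamlit as st
from databricks import sql  # from the databricks-sql-connector package

st.title("Hello from a Databricks App")

# Connect to a SQL warehouse using values taken from the environment.
with sql.connect(
    server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
    http_path=os.environ["DATABRICKS_HTTP_PATH"],
    access_token=os.environ["DATABRICKS_TOKEN"],
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT current_user() AS user, current_date() AS today")
        rows = cursor.fetchall()

# Render the result as an interactive table in the app UI.
st.dataframe([row.asDict() for row in rows])
```

Locally, `streamlit run app.py` with the three variables exported gives roughly the same behaviour you would see once the app is deployed to a workspace.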
25 November 2025, 10:00 AM GMT | 11:00 AM CET
Managing Databricks at scale using Terraform
This session provides a practical overview of using Terraform to manage Databricks infrastructure at scale, including workspaces, Unity Catalog, and core resources. Participants will gain a solid understanding of why Infrastructure as Code and tools like Terraform are essential for scalable, reproducible, and manageable enterprise data operations.
The agenda covers considerations for designing and deploying Databricks environments with Terraform, setting up resources, structuring deployments for scalability, and evaluating asset management options. Attendees will leave with actionable knowledge on code modularisation, state management, and promoting configurations across multiple environments, ensuring efficient and robust operations for data teams.
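As a taste of what managing Databricks as code can look like, below is a minimal sketch in Python using CDKTF (the Cloud Development Kit for Terraform) rather than HCL, which is what the session itself works with. The prebuilt provider package and module paths (`cdktf-cdktf-provider-databricks`), the catalog and schema names, and the reliance on environment-based authentication are all assumptions for illustration.

```python
# A minimal CDKTF sketch defining a Unity Catalog catalog and schema.
# Package/module paths and resource arguments are assumptions based on
# the prebuilt Databricks provider's naming conventions.
from constructs import Construct
from cdktf import App, TerraformStack

# Prebuilt provider bindings: pip install cdktf-cdktf-provider-databricks
from cdktf_cdktf_provider_databricks.provider import DatabricksProvider
from cdktf_cdktf_provider_databricks.catalog import Catalog
from cdktf_cdktf_provider_databricks.schema import Schema


class AnalyticsStack(TerraformStack):
    """One stack per environment keeps state small and promotable."""

    def __init__(self, scope: Construct, id: str, env: str):
        super().__init__(scope, id)

        # Credentials come from the usual provider mechanisms, e.g. the
        # DATABRICKS_HOST / DATABRICKS_TOKEN environment variables.
        DatabricksProvider(self, "databricks")

        # A catalog and schema per environment, named by convention.
        catalog = Catalog(
            self,
            "analytics_catalog",
            name=f"analytics_{env}",
            comment="Managed by Terraform via CDKTF",
        )
        Schema(
            self,
            "raw_schema",
            catalog_name=catalog.name,
            name="raw",
        )


app = App()
# Reusing one stack class is one way to promote the same configuration
# across environments while keeping separate Terraform state per stack.
AnalyticsStack(app, "databricks-dev", env="dev")
AnalyticsStack(app, "databricks-prod", env="prod")
app.synth()
```

Running `cdktf deploy` drives the same plan-and-apply workflow as plain Terraform; splitting stacks per environment is one common approach to the state management and promotion concerns the session covers.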