
Two ways to use this template

Use with your coding agent
  1. Click "Copy prompt" below
  2. Paste into Cursor, Claude Code, Codex, or any coding agent
  3. Your agent builds the app, asking questions along the way so the result is exactly what you want
or
Read step-by-step

Follow the steps below to set things up manually, at your own pace.

Agentic Support Console

End-to-end AI-powered support console combining Lakebase, Lakehouse Sync, a medallion pipeline, an LLM agent job, reverse sync, and a Databricks App with Genie analytics.

Agentic Support Console preview

Includes a working starter app

Real, runnable code lives on GitHub. When you copy the prompt above, your coding agent clones it as the starting point and adapts it to your data and use case.

examples/agentic-support-console/template/
View on GitHub

Agentic Support Console

This template brings together the full Databricks developer stack in a single operational data application: an AI-powered support console where every customer message is automatically triaged by an LLM, and support agents review, approve, or override each suggestion from a purpose-built internal tool.

Data Flow

Customer interactions flow from your application's OLTP database (Lakebase Postgres) through the lakehouse via CDC, get enriched by an AI agent, and are served back to the support console through reverse sync:

  1. OLTP writes land in Lakebase Postgres (users, orders, support cases, messages).
  2. Lakehouse Sync replicates every change into Unity Catalog as CDC history tables (bronze layer).
  3. A Lakeflow Declarative Pipeline transforms CDC history into current-state silver tables and analytical gold materialized views (daily revenue, support overview, user profiles, case context).
  4. A Lakeflow Job runs every minute, finds unanswered messages, builds rich context from gold tables, calls an LLM via AI Gateway, and merges suggested responses into a Delta table.
  5. Sync Tables (reverse sync) replicate gold tables back into Lakebase for sub-10ms reads.
  6. The Support Console (Databricks App) reads from both OLTP and synced gold tables to present cases, AI suggestions, and analytics.
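Step 4 is the heart of the loop: the job turns gold-table context into an LLM prompt before calling AI Gateway and merging the suggestion back into Delta. A minimal sketch of that prompt-building step, assuming illustrative column names (`case_id`, `subject`, `user_tier`, `recent_messages`) that may differ from the template's actual schema:

```python
# Hypothetical sketch of the triage job's context-to-prompt step.
# Field names are illustrative; the real schema lives in the template's
# gold tables (case context, user profiles).
from dataclasses import dataclass


@dataclass
class CaseContext:
    case_id: str
    subject: str
    user_tier: str
    recent_messages: list[str]


def build_triage_prompt(ctx: CaseContext) -> str:
    """Assemble the LLM prompt for one unanswered support case."""
    history = "\n".join(f"- {m}" for m in ctx.recent_messages)
    return (
        "You are a support triage assistant.\n"
        f"Case {ctx.case_id} ({ctx.user_tier} customer): {ctx.subject}\n"
        f"Recent messages:\n{history}\n"
        "Suggest a draft response and a priority (low/medium/high)."
    )
```

The job would then send this prompt to the configured AI Gateway endpoint and `MERGE` the returned suggestion into a Delta table keyed by case ID, which is what the reverse sync in step 5 serves back to the console.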

What to Adapt

Provisioning (manual steps and SQL), seeding, pipeline deploys, reverse sync, and app deploy are documented in the repository’s template/README.md alongside the code.

To make this template your own:

  • Catalog: Set the catalog variable in each pipeline's databricks.yml to your Unity Catalog catalog name.
  • Lakebase: Point the app's databricks.yml at your own Lakebase project, branch, and database.
  • Tables: The seed script creates the OLTP schema. After seeding, configure Lakehouse Sync to replicate your public schema tables.
  • Sync Tables: Manually create the four reverse sync configurations (see the README for the exact table mappings).
  • AI Gateway: Set the endpoint variable to your preferred model serving endpoint.
  • Genie Space: Create a Genie space over your gold tables and set the genie_space_id in the app bundle.
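Most of these adaptations are bundle variables. A hedged sketch of what the overrides might look like in a `databricks.yml`, following Databricks Asset Bundle conventions; the exact variable names and defaults are defined in each bundle in the repository, and the values below are placeholders:

```yaml
# Illustrative only: check each bundle's databricks.yml for the real
# variable names before overriding.
variables:
  catalog:
    description: Unity Catalog catalog for the medallion pipeline
    default: my_catalog
  endpoint:
    description: AI Gateway model serving endpoint used by the triage job
    default: my-serving-endpoint
  genie_space_id:
    description: ID of the Genie space built over the gold tables
    default: "<your-genie-space-id>"
```

Variables can also be overridden at deploy time (for example, `databricks bundle deploy --var="catalog=prod_catalog"`) rather than edited in place.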

Built on these templates

This example's codebase and the agent prompt above both build on top of the templates below. Open one to dive into a specific technique on its own or apply it to a different project.

Template
Operational Data Analytics

End-to-end setup for analyzing operational database data in the lakehouse: Unity Catalog with external storage, Lakebase provisioning, Lakehouse Sync CDC replication, and a medallion architecture pipeline with silver and gold layers.

Template
App with Lakebase

Wire up a Databricks App with Lakebase for persistent data storage. Includes schema setup and full CRUD API routes.

Template
Genie Conversational Analytics

Embed a Databricks AI/BI Genie chat interface so users can explore data through natural language. Configure a Genie space, wire up server and client plugins, declare app resources, and deploy.

Template
Query AI Gateway Endpoints

Query AI Gateway endpoints for production-ready access to foundation models with built-in governance.