Two ways to use this template:

1. **Copy the prompt.** Click "Copy prompt" below and paste it into Cursor, Claude Code, Codex, or any coding agent. The agent builds everything out, asking questions along the way so the result is exactly what you want.
2. **Do it manually.** Follow the steps below to set things up yourself, at your own pace.
# Onboard Your Coding Agent
Install Databricks agent skills (project-scoped), wire up the DevHub Docs MCP server, and bootstrap an AGENTS.md so your coding assistant knows this repo's workspace defaults.
## Prerequisites
- A working Databricks CLI profile. The agent skills installer auto-detects installed coding agents and symlinks Databricks-specific instruction files into them; no auth is required for the install itself, but every skill you install assumes a valid profile downstream. If `databricks auth profiles` does not show a `Valid: YES` profile, run Set Up Your Local Dev Environment first.
- A repo to onboard the agent into. Run this from the root of the project the agent will work on. If the user does not have a project yet, run Spin Up a Databricks App first and come back here from inside the scaffolded directory.
- A coding agent installed locally. The Databricks aitools installer detects Cursor, Claude Code, Codex CLI, OpenCode, GitHub Copilot, and Antigravity. The DevHub MCP server install via `npx add-mcp` works with the same set plus VS Code.
- `npx` available. The DevHub MCP install runs through `npx add-mcp`; `npx` ships with Node.js 18+.
- Knowledge of which Databricks resources this repo will use. Before writing `AGENTS.md` you'll ask the user which CLI profile, workspace URL, Unity Catalog catalog/schema, Lakebase project/branch, Genie space, and Model Serving endpoint to treat as defaults for this repo. It is fine to leave fields blank with a `TODO:` marker if the user does not know yet.
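As a quick sanity check, the tool prerequisites above can be scripted. This is a sketch, not part of any installer; the tool list mirrors what this guide relies on:

```shell
# Sketch: check that the tools this guide relies on are present.
# The tool list is an assumption based on the prerequisites above.
missing=0
for tool in databricks npx node; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "OK: $tool"
  else
    echo "MISSING: $tool"
    missing=$((missing + 1))
  fi
done
echo "prereq check done, $missing tool(s) missing"
# If databricks is present, also confirm an authenticated profile:
#   databricks auth profiles   # look for Valid: YES
```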
Make a Databricks repo agent-ready in four steps: install Databricks platform skills into the user's coding agent (project-scoped, so the rules ride with the repo), wire up the DevHub Docs MCP server so the agent can fetch any DevHub page on demand, bootstrap an AGENTS.md (with a symlinked CLAUDE.md) that pins the workspace defaults this codebase should use, and smoke-test the result.
References:

- Agent skills: what the Databricks skills give your agent, plus the full `databricks experimental aitools` flag matrix.
- Docs MCP Server: what the DevHub MCP server exposes and how to verify it is connected.
## 1. Install Databricks agent skills (project-scoped)
Skills are task-specific instruction files (databricks-apps, databricks-core, databricks-lakebase, databricks-pipelines, databricks-jobs, etc.) that tell the user's coding agent how Databricks works — CLI conventions, auth patterns, resource shapes — so it generates correct code instead of guessing.
By default, skills install globally to each agent's user-level config directory. For a repo handed off to a team, prefer project scope so the rules live alongside the code and travel with the repo:
```shell
databricks experimental aitools install --project
```
This installs every Databricks skill into the current project directory's agent config (e.g. `.cursor/rules/`, `.claude/skills/`). Run it from the repo root.
If the user only wants a subset, scope by skill name and/or by agent:
```shell
databricks experimental aitools install --project --skills databricks-apps,databricks-lakebase --agents cursor,claude-code
```
Verify what got installed:
```shell
databricks experimental aitools list --project
```
`databricks experimental aitools install --help` is the source of truth for the flag list; DevHub mirrors it on the agent skills page, but the CLI is authoritative.
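To see where the files actually landed, you can also check the per-agent config directories directly. The directory names and simulated layout below are illustrative assumptions; the demo runs against a throwaway directory so it is safe to try anywhere:

```shell
# Demo in a throwaway dir: simulate a repo where only the Claude Code
# skills were installed, then probe the usual project-scoped locations.
# Both the directory names and the file name are assumptions.
repo="$(mktemp -d)"
mkdir -p "$repo/.claude/skills"
touch "$repo/.claude/skills/databricks-core.md"   # placeholder skill file

found=0
for dir in "$repo/.cursor/rules" "$repo/.claude/skills"; do
  if [ -d "$dir" ]; then
    echo "found: $dir"
    found=$((found + 1))
  else
    echo "not found: $dir"
  fi
done
echo "$found of 2 locations present"
```

Run the same loop from your real repo root (with `repo=.`) to see which agents received skills.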
## 2. Wire up the DevHub Docs MCP server
The DevHub MCP server gives coding agents read access to every page on dev.databricks.com (docs, templates, and examples) without leaving the editor. The agent can call `list_docs_resources` to see the index and `get_doc_resource(slug)` to fetch any page as markdown.
Install at project scope so the server is bound to this repo (add `-g` if you want it user-wide instead):
```shell
npx add-mcp https://dev.databricks.com/api/mcp --name devhub-docs
```
To target a specific agent (otherwise the installer auto-detects):
```shell
npx add-mcp https://dev.databricks.com/api/mcp --name devhub-docs -a cursor
```
Restart the editor after installation. Some editors (e.g. Cursor) require visiting the MCP settings page and toggling `devhub-docs` to enabled.
Verify the connection:

- Confirm `devhub-docs` shows up in the agent's tool list.
- Ask the agent to call `list_docs_resources`; it should return the DevHub markdown index.
- Ask the agent to call `get_doc_resource(slug: "start-here")`; it should return the start-here doc as markdown.
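If the server never appears in the tool list, inspecting the agent's MCP config file directly is a useful fallback. The JSON shape below is an illustrative assumption (the exact schema varies by agent and may differ from what `add-mcp` writes), and the demo uses a throwaway file:

```shell
# Demo against a throwaway config file; point cfg at your agent's real
# MCP config (e.g. .cursor/mcp.json) instead. The JSON shape here is an
# assumption for illustration only.
cfg="$(mktemp -d)/mcp.json"
cat > "$cfg" <<'EOF'
{
  "mcpServers": {
    "devhub-docs": { "url": "https://dev.databricks.com/api/mcp" }
  }
}
EOF

if grep -q '"devhub-docs"' "$cfg"; then
  echo "devhub-docs registered in $cfg"
else
  echo "devhub-docs missing: re-run npx add-mcp"
fi
```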
## 3. Bootstrap an AGENTS.md with this repo's Databricks defaults
AGENTS.md is the project-root file that any modern coding agent reads first to learn how to behave on this codebase. We want a "Working with Databricks" section that pins the resources and CLI profile this repo should default to, so the agent stops guessing on every prompt.
### 3a. Detect existing files first (do not overwrite)
Before writing anything, check what already exists:
```shell
ls AGENTS.md CLAUDE.md 2>/dev/null
```
- Both exist as separate files → ask the user whether `CLAUDE.md` is a symlink to `AGENTS.md` (run `ls -l CLAUDE.md`) or a divergent file. If divergent, surface that to the user and ask before merging; never overwrite hand-written agent instructions.
- Only `AGENTS.md` exists → append the "Working with Databricks" section below, then `ln -s AGENTS.md CLAUDE.md` so Claude Code reads the same content.
- Only `CLAUDE.md` exists → rename it to `AGENTS.md` (`mv CLAUDE.md AGENTS.md`), then symlink (`ln -s AGENTS.md CLAUDE.md`) and append the section.
- Neither exists → create `AGENTS.md` with the template below, then `ln -s AGENTS.md CLAUDE.md`.
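The decision table above can be sketched as a shell script. The demo below runs in a throwaway directory simulating the "only CLAUDE.md exists" case; to use it for real, point `repo` at the project root instead:

```shell
# Sketch of the 3a decision logic, demoed in a throwaway directory.
repo="$(mktemp -d)"
cd "$repo"
touch CLAUDE.md                        # simulate: only CLAUDE.md exists

if [ -e AGENTS.md ] && [ -e CLAUDE.md ] && [ ! -L CLAUDE.md ]; then
  echo "both are real files: ask the user before merging"
elif [ -e AGENTS.md ] && [ ! -e CLAUDE.md ]; then
  ln -s AGENTS.md CLAUDE.md            # only AGENTS.md: add the symlink
elif [ -e CLAUDE.md ] && [ ! -e AGENTS.md ]; then
  mv CLAUDE.md AGENTS.md               # only CLAUDE.md: rename, then symlink
  ln -s AGENTS.md CLAUDE.md
elif [ ! -e AGENTS.md ]; then
  touch AGENTS.md                      # neither: create, then symlink
  ln -s AGENTS.md CLAUDE.md
fi

ls -l CLAUDE.md                        # should point at AGENTS.md
```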
### 3b. Ask the user for this repo's defaults
Before generating AGENTS.md, walk through these questions one at a time (using a multiple-choice tool when available). Leave any answer blank with `TODO:` if the user does not know yet; you can fill it in later as the project develops. Do not infer or invent values.
- CLI profile for this repo (e.g. `DEFAULT`, `my-prod-workspace`). Get the list with `databricks auth profiles`.
- Workspace URL (the `Host` column from `databricks auth profiles` for the chosen profile).
- Unity Catalog defaults: catalog name and schema name this repo's tables should land in. Defaults are often `<team_name>_dev` and `<project_name>` respectively.
- Lakebase defaults (only if the app uses Lakebase): project name, branch (typically `production`), database name, endpoint name (typically `primary`).
- Genie space ID (only if the app embeds Genie).
- Model Serving endpoint name (only if the app calls Databricks-hosted models).
### 3c. Write the section
Append this block to `AGENTS.md`, substituting the user's answers and keeping `TODO:` markers for anything they did not specify:
## Working with Databricks
This repo deploys onto a single Databricks workspace. When suggesting CLI commands, infrastructure-as-code, or queries, default to these values unless the user asks for something else. Surface the assumption out loud whenever you act on one of them so the user can correct you.
**CLI profile**: `<PROFILE>` — pass `--profile <PROFILE>` on every `databricks` command, or set `export DATABRICKS_CONFIG_PROFILE=<PROFILE>` in the shell session.
**Workspace URL**: `<https://<workspace>.cloud.databricks.com>`
**Unity Catalog defaults**:
- Catalog: `<catalog>`
- Schema: `<schema>`
- Reference tables as `` `<catalog>.<schema>.<table>` `` in SQL.
**Lakebase defaults** _(only if the app persists data in Lakebase)_:
- Project: `<project>`
- Branch: `production`
- Database: `<database>`
- Endpoint: `primary`
**Genie space** _(only if the app uses conversational analytics)_:
- Space ID: `<space-id>`
**Model Serving endpoint** _(only if the app calls Databricks-hosted models)_:
- Endpoint name: `<endpoint>`
**Conventions**:
- Always run `databricks auth profiles` and confirm `<PROFILE>` shows `Valid: YES` before running anything that hits the workspace.
- For non-trivial destructive operations (`databricks apps delete`, `DROP TABLE`, etc.), ask the user to confirm before running.
- DevHub is the source of truth for the Databricks developer stack. When unsure of a CLI flag or a plugin shape, fetch the matching page from <https://dev.databricks.com/llms.txt> via the `devhub-docs` MCP server before guessing.
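Substituting the answers can be scripted with a heredoc, with unanswered fields keeping their `TODO:` markers. The fragment below writes only the first two fields into a throwaway file as an illustration; the profile value is hypothetical:

```shell
# Sketch: substitute answers into the template; anything the user did not
# specify keeps a TODO: marker. Writes to a throwaway file for the demo.
agents="$(mktemp)"                   # stand-in for ./AGENTS.md
PROFILE="my-prod-workspace"          # example answer (hypothetical)
WORKSPACE_URL="TODO: workspace URL"  # user did not know yet

cat >> "$agents" <<EOF
## Working with Databricks

**CLI profile**: \`$PROFILE\`
**Workspace URL**: \`$WORKSPACE_URL\`
EOF

grep -c 'TODO:' "$agents"            # count of fields still unanswered
```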
### 3d. Symlink CLAUDE.md → AGENTS.md
Claude Code looks for CLAUDE.md, while Codex and most other agents read AGENTS.md. To keep one source of truth:
```shell
ln -s AGENTS.md CLAUDE.md
```
On Windows without symlink support, copy the file instead and remind the user to keep the two in sync (or use `mklink /H CLAUDE.md AGENTS.md` in cmd for a hard link).
Confirm:
```shell
ls -l AGENTS.md CLAUDE.md
```

`CLAUDE.md` should show as `-> AGENTS.md`.
## 4. Smoke-test the agent
Open a fresh chat with the user's coding agent and ask it:
> Look at AGENTS.md and tell me which CLI profile and Unity Catalog schema this repo uses by default.
The agent should answer correctly without needing to fetch any extra context; that confirms the agent is reading AGENTS.md and the Databricks skills are loaded. If it cannot, re-check that the skill install step ran in the project directory (`databricks experimental aitools list --project`) and that AGENTS.md is at the repo root.
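You can approximate the smoke test without an agent by grepping AGENTS.md for the two answers yourself. The file contents below are illustrative stand-ins:

```shell
# Sketch: the lines the agent should be able to find. Written to a
# throwaway file here; grep your real ./AGENTS.md instead.
agents="$(mktemp)"
printf '%s\n' \
  '**CLI profile**: `my-prod-workspace`' \
  '- Schema: `myapp`' > "$agents"

grep -E 'CLI profile|Schema' "$agents"
```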
## Where to next
- Templates catalog: pick a template, copy the prompt, and the agent now has full Databricks context to execute against your workspace defaults.
- DevHub Docs MCP Server reference: full tool list and connection troubleshooting.
- Agent skills reference: the full skill catalog plus the `--global`/`--project`/`--agents`/`--skills` flag matrix.