# Build with AI
The Databricks Partner AI Dev-Kit is an open-source toolkit that gives your AI coding assistant the patterns and rules needed to generate PWAF-compliant Databricks connectors. Instead of reading the docs and writing code manually, you describe what you need and your AI assistant generates a connector with authentication, telemetry, and compliance built in.
The kit works with Claude Code, Cursor, Codex, and any MCP-compatible AI coding assistant.
## What you can build
| Use case | Connector | Languages |
|---|---|---|
| Run SQL queries against a SQL warehouse | SQL drivers and connectors | Python, Java, Go, Node.js |
| Call workspace APIs (jobs, Unity Catalog, clusters) | Databricks SDKs | Python, Java, Go |
| Run Spark workloads from outside Databricks | Databricks Connect | Python |
| Make direct HTTP calls in any language | REST API | Any |
| Use an ORM against a SQL warehouse | SQLAlchemy | Python |
Every generated connector covers all four authentication flows and includes built-in User-Agent telemetry.
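At the HTTP level, "User-Agent telemetry" simply means every request carries a partner-identifying header. The sketch below shows the idea for a bare REST call; the URL path is a standard Databricks Jobs API endpoint, but the User-Agent value here is a placeholder, and the generated connectors set the real format for you (see Telemetry & Attribution).

```python
def build_rest_request(host: str, token: str, user_agent: str) -> dict:
    """Assemble a REST call against the Jobs API with partner telemetry.

    Illustrative sketch only: generated connectors configure the
    User-Agent automatically; see Telemetry & Attribution for the format.
    """
    return {
        "url": f"{host}/api/2.1/jobs/list",
        "headers": {
            "Authorization": f"Bearer {token}",
            # The partner-specific User-Agent is what enables attribution.
            "User-Agent": user_agent,
        },
    }
```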
## How it works
**Step 1 — Install.** Clone the repository and run the installer, or connect via the MCP server (no clone required).

**Step 2 — Pick a skill.** Each connector type has a dedicated skill. Tell your AI assistant which skill to use, or let the MCP server serve it on demand.

**Step 3 — Generate.** Run the connector's build prompt. Your AI assistant reads the skill, generates the connector code, configures authentication, sets the User-Agent string, and writes the test runner.

**Step 4 — Validate.** Run the automated PWAF compliance check. A fully compliant connector scores 12/12.
## Getting started
### Option A — MCP server (no cloning required)
Install the dependency:
```shell
pip install mcp
```
Add the server to your `.mcp.json` (Claude Code) or `~/.cursor/mcp.json` (Cursor):
```json
{
  "mcpServers": {
    "databricks-pwaf": {
      "command": "python",
      "args": ["-m", "databricks_pwaf_mcp"],
      "cwd": "/path/to/databricks-partner-ai-dev-kit/mcp"
    }
  }
}
```
Skills are fetched from GitHub automatically on first use and cached for 24 hours.
### Option B — Clone and reference directly
```shell
git clone https://github.com/databricks-solutions/partner-ai-dev-kit.git databricks-partner-ai-dev-kit
cd databricks-partner-ai-dev-kit
bash install.sh
```
Point your AI assistant at the `skills/` directory. For Claude Code, add it to `CLAUDE.md`; for Cursor, add it to `.cursor/rules/`.
## Supported stacks
| Language | Connectors |
|---|---|
| Python | SDK, SQL Connector, SQLAlchemy, Databricks Connect |
| Java | SDK, JDBC |
| Go | SDK, SQL Driver |
| Node.js | SQL Driver |
| Any | REST API |
## Authentication coverage
All skills generate connectors that implement all four authentication types required for PWAF validation:
| Auth type | Use case |
|---|---|
| Personal Access Token (PAT) | Simple, user-specific access |
| OAuth M2M | Machine-to-machine, service principals, backend systems |
| OAuth U2M | Interactive, browser-based, user-delegated access |
| Token pass-through | Pre-obtained tokens, CI/CD, headless environments |
The auth type is selected at runtime via `APP_AUTH_TYPE` — never `DATABRICKS_AUTH_TYPE`.
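The runtime selection can be sketched as a simple dispatch on the APP_AUTH_TYPE environment variable. The type names below are illustrative assumptions for the four flows, not the Dev-Kit's exact identifiers:

```python
import os

# Illustrative names for the four PWAF auth flows; the skills define
# the exact identifiers used by generated connectors.
SUPPORTED_AUTH_TYPES = {"pat", "oauth-m2m", "oauth-u2m", "token-passthrough"}

def resolve_auth_type() -> str:
    """Read the auth type from APP_AUTH_TYPE (never DATABRICKS_AUTH_TYPE)."""
    auth_type = os.environ.get("APP_AUTH_TYPE", "pat")
    if auth_type not in SUPPORTED_AUTH_TYPES:
        raise ValueError(f"Unsupported APP_AUTH_TYPE: {auth_type!r}")
    return auth_type
```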
## PWAF compliance check
Every generated connector ships with an automated compliance check that validates all PWAF requirements:
```
validate_pwaf_tool("/path/to/your/connector")
```
A fully compliant connector returns:
```json
{
  "overall": "PASS",
  "summary": { "PASS": 12, "FAIL": 0, "WARN": 0, "SKIP": 0 }
}
```
The checks cover User-Agent telemetry, all four auth types, an isolated test runner, absence of hardcoded credentials, presence of a `.env.template`, and a build report.
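If you script around the check, the result can be inspected with a small helper like this one (illustrative; the only assumption is the result shape shown above):

```python
def summarize_pwaf_result(result: dict) -> str:
    """One-line summary of a PWAF compliance result dict."""
    counts = result["summary"]
    total = sum(counts.get(k, 0) for k in ("PASS", "FAIL", "WARN", "SKIP"))
    return f'{result["overall"]}: {counts["PASS"]}/{total} checks passed'
```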
## What's next
- Review integration requirements: Understand what Databricks requires for partner validation. See Integration Requirements.
- Explore telemetry attribution: Learn the User-Agent format the Dev-Kit implements for you. See Telemetry & Attribution.
- Get the toolkit: Clone, install, and run your first build prompt. See Partner AI Dev-Kit.