Integration requirements
This page describes the minimum technical requirements for validated Technology Partner integrations with Databricks. Integrations that do not meet these requirements will not pass Databricks validation.
These requirements apply to every integration modality: APIs, SDKs, drivers/connectors, jobs, CLIs, and other tooling.
| Requirement | Summary | Details |
|---|---|---|
| Telemetry | Programmatic tagging to provide consistent telemetry across all integration modalities | Telemetry & Attribution |
| OAuth | Support Token Federation, U2M with PKCE, and Databricks M2M | Access & Authentication |
| Unity Catalog | Use UC namespaces and interfaces; respect ACLs; use UC Volumes for staging; publish metadata/lineage | Catalog & Metadata |
All integrations must provide a guided, systematic setup process that connects the partner's platform to the customer's Databricks environment, and must adhere to the best practices outlined in the Databricks Design Patterns and Partner Product Categories.
Telemetry
Telemetry is required for all partner integrations, enabling Databricks and its partners to understand integration usage, measure joint business impact, identify joint customers, and quickly diagnose operational issues across environments.
Partner User-Agent (required): Programmatically embed a stable partner User-Agent telemetry identifier in every Databricks API call, driver/SQL connection, SDK/connector call, job, CLI invocation, or other integration request originating from your product. Databricks uses this identifier for partner attribution and reporting.
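For example, a minimal sketch of User-Agent tagging in Python, assuming the Databricks SDK for Python and a hypothetical partner product named `acme-connector`; JDBC, ODBC, and other connectors expose equivalent user-agent settings:

```python
# Minimal User-Agent tagging sketch. "acme-connector" and the version
# string are hypothetical placeholders for your registered identifier.
import requests

from databricks.sdk import WorkspaceClient

# The SDK appends product/product_version to the User-Agent header of
# every API request it issues, which Databricks uses for attribution.
w = WorkspaceClient(product="acme-connector", product_version="1.2.0")

# For direct REST calls, carry the same identifier yourself.
resp = requests.get(
    "https://<workspace-host>/api/2.1/unity-catalog/catalogs",
    headers={
        "Authorization": "Bearer <access-token>",
        "User-Agent": "acme-connector/1.2.0",
    },
)
```

Keep the identifier itself stable across releases and carry the version separately, so attribution stays consistent as your product evolves.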
For details on implementing telemetry, see Telemetry & Attribution.
OAuth
OAuth is mandatory for all ISV Partner integrations with Databricks and is the standard mechanism for secure production authentication. Databricks supports OAuth 2.0 flows for both user-driven (U2M) and automated (M2M) interactions, allowing partners to protect customer data while enabling both interactive and programmatic access.
Supported OAuth flows
| Industry term | Databricks term | Use case |
|---|---|---|
| User Interactive Flow (Authorization Code Flow with PKCE) | User-to-Machine (U2M) | Desktop apps, SaaS/cloud apps, multi-user scenarios |
| Client Credentials Flow | Machine-to-Machine (M2M) | Automation, service principals, federated workloads |
User interactive flow (User-to-Machine)
Used when a user authenticates with their own identity through a browser-based OAuth flow, allowing the partner app to act on their behalf; a PKCE sketch follows the table below.
| App Type | Pattern | Documentation |
|---|---|---|
| Desktop / single-user | Local loopback OAuth (localhost redirect, 1-hour TTL) | JDBC | ODBC |
| SaaS / multi-user | OIDC U2M or Token Federation (required for headless scenarios) | OIDC U2M | Token Federation |
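As referenced above, a minimal sketch of the Authorization Code flow with PKCE against the workspace-level `/oidc/v1/authorize` and `/oidc/v1/token` endpoints; the client ID and localhost redirect are hypothetical placeholders for your registered OAuth app:

```python
# PKCE sketch for the U2M flow. Generates a code verifier/challenge pair
# and builds the authorize URL the user's browser should be sent to.
import base64
import hashlib
import secrets
from urllib.parse import urlencode

verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
challenge = (
    base64.urlsafe_b64encode(hashlib.sha256(verifier.encode()).digest())
    .rstrip(b"=")
    .decode()
)

params = {
    "client_id": "<oauth-client-id>",          # hypothetical registered app
    "response_type": "code",
    "redirect_uri": "http://localhost:8020/callback",  # local loopback
    "scope": "all-apis offline_access",        # offline_access -> refresh token
    "code_challenge": challenge,
    "code_challenge_method": "S256",
    "state": secrets.token_urlsafe(16),
}
authorize_url = "https://<workspace-host>/oidc/v1/authorize?" + urlencode(params)

# Direct the user's browser to authorize_url, then exchange the returned
# authorization code (plus the verifier) at
# https://<workspace-host>/oidc/v1/token for access and refresh tokens.
```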
Client credentials flow (Machine-to-Machine)
Enables automated, non-interactive authentication between systems, typically using service principals (SPs), so jobs, services, and scripts can access Databricks without a signed-in user; a token-request sketch follows the table below.
| Option | Description |
|---|---|
| Workload Identity Federation (WIF) | Per-SP trust policies; eliminates secrets (recommended) |
| Account Token Federation | Account-wide federation for unified setup |
| Databricks OIDC M2M | Uses long-lived secrets; use only when customer doesn't support federated identity |
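A minimal sketch of the client credentials grant (the Databricks OIDC M2M row above); the client ID and secret are hypothetical placeholders for a customer-created service principal, and the federated options swap the secret for a customer-issued JWT:

```python
# Minimal M2M token request sketch against the workspace OIDC endpoint.
import requests

resp = requests.post(
    "https://<workspace-host>/oidc/v1/token",
    auth=("<sp-client-id>", "<sp-client-secret>"),  # hypothetical SP credentials
    data={"grant_type": "client_credentials", "scope": "all-apis"},
)
resp.raise_for_status()
access_token = resp.json()["access_token"]  # short-lived; request per workload

# With token federation (WIF / Account Token Federation), the request
# instead uses the token-exchange grant with a customer-issued JWT:
#   grant_type=urn:ietf:params:oauth:grant-type:token-exchange
#   subject_token=<external JWT>
#   subject_token_type=urn:ietf:params:oauth:token-type:jwt
```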
Security and lifecycle recommendations
- Request least-privilege scopes
- Persist tokens securely
- Rotate refresh tokens
- Use PKCE for added protection
- Never log tokens (ensure redaction; see the sketch after this list)
- Support revocation
- Provide a clear disconnect/re-auth experience
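For the redaction item above, a minimal logging-filter sketch; the regex and token shapes are illustrative, not an exhaustive pattern:

```python
# Illustrative token-redaction filter for Python logging. The regex
# covers Bearer headers and "dapi"-prefixed personal access tokens;
# extend it to every credential shape your integration handles.
import logging
import re

TOKEN_RE = re.compile(r"(Bearer\s+|dapi)[\w.\-]+")

class RedactTokens(logging.Filter):
    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = TOKEN_RE.sub(r"\1[REDACTED]", record.getMessage())
        record.args = None  # message is already fully formatted
        return True  # keep the record, just scrubbed

logging.basicConfig(level=logging.INFO)
logging.getLogger().addFilter(RedactTokens())
logging.warning("auth failed, header=Bearer eyJabc.def")  # token redacted
```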
Unity Catalog
If an ISV integration reads, writes, stages, or manages any type of data asset, then the integration must register and operate on those assets in Unity Catalog.
Data assets managed by UC: Structured/tabular data (tables, views, materialized views, metric views, UDFs), unstructured files (images, documents, logs, binaries, ingestion files), and ML models.
Mandatory practices
| Practice | Requirement |
|---|---|
| UC semantics | Operate on the three-level namespace `<catalog>.<schema>.<table>` and use UC interfaces (APIs, drivers, connectors, or UI) so permissions, audit logs, and metadata are preserved |
| Least privilege | Design workflows for non-admin users and document the minimum privileges customers must grant |
| Staging / non-tabular data | Use Unity Catalog Volumes for governed staging. Flow: customer creates catalog and grants permission → partner creates schema/volume → partner writes files → ingest into Delta (see the sketch below). Staging to cloud storage buckets isn't eligible for validation |
| Metadata and lineage | Read and publish schema, tags, and lineage via UC so customers retain a single source of truth. See Catalog & Metadata |
Documentation: Unity Catalog | Data Governance
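A minimal sketch of the staging flow above using the Databricks SDK for Python; the catalog, schema, volume, and table names are hypothetical, and the customer is assumed to have already created the catalog and granted the partner principal the documented minimum privileges:

```python
# Governed staging sketch: schema + volume under the customer's catalog,
# file upload via the UC Files API, then COPY INTO a Delta table using
# the three-level namespace. All names are illustrative.
import io

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.catalog import VolumeType

w = WorkspaceClient()  # authenticates the partner service principal

# Partner creates a schema and a managed volume in the customer's catalog.
w.schemas.create(name="partner_staging", catalog_name="customer_cat")
w.volumes.create(
    catalog_name="customer_cat",
    schema_name="partner_staging",
    name="ingest",
    volume_type=VolumeType.MANAGED,
)

# Stage files into the governed volume (never into a raw cloud bucket).
w.files.upload(
    "/Volumes/customer_cat/partner_staging/ingest/orders.csv",
    io.BytesIO(b"id,amount\n1,9.99\n"),
    overwrite=True,
)

# Ingest into Delta; assumes the target table exists and a SQL warehouse ID.
w.statement_execution.execute_statement(
    warehouse_id="<warehouse-id>",
    statement="""
        COPY INTO customer_cat.partner_staging.orders
        FROM '/Volumes/customer_cat/partner_staging/ingest'
        FILEFORMAT = CSV
        FORMAT_OPTIONS ('header' = 'true')
    """,
)
```

Because every step runs through UC interfaces, the customer's ACLs, audit logs, and lineage apply to the staged files and the resulting table.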
What's next
- Implement telemetry: Learn how to configure User-Agent attribution for your integration. See Telemetry & Attribution.
- Explore integration patterns: Understand architectural patterns for partner integrations. See Integration Patterns.
- Review product categories: Find specific requirements for your product type. See Partner Product Categories.