Partner product categories
Each category page outlines integration patterns, best practices, and requirements specific to your product type, including the standards required for validation.
For foundational technical patterns that apply across all categories, see Databricks Design Patterns.
Category overview
| Category | Description | Key Focus Areas |
|---|---|---|
| Data Engineering | Partners building data ingestion, ETL/ELT, and data transformation tools | Data ingestion, transformation, pipeline orchestration, Unity Catalog integration |
| AI/ML | Partners building AI, machine learning, and model management solutions | MLflow integration, Unity Catalog model registry, foundation models (LLMs), vector search, feature engineering |
| Business Intelligence | Partners building BI, analytics, and visualization tools | SQL Warehouses, query pushdown, OAuth, CloudFetch, Genie integration |
| Governance & Observability | Partners building data governance, cataloging, security, and observability tools | Unity Catalog integration, lineage, access control, audit logging, data security |
| Apps & Dev Tools | Partners building development tools and applications | Databricks SDKs, REST APIs, Connect, Apps framework |
Common integration requirements
All partner integrations must meet the following foundational requirements:
- Unity Catalog integration - Register all data assets in Unity Catalog for consistent governance
- OAuth authentication - Use OAuth 2.0 for secure, token-based authentication
- User-Agent telemetry - Implement User-Agent tagging for attribution and monitoring (see Telemetry & Attribution)
In addition to these requirements, partner integrations must follow the category-specific best practices described in this section.
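To make the three common requirements concrete, here is a minimal sketch using the Databricks SDK for Python (`pip install databricks-sdk`). The workspace host, service principal credentials, SQL warehouse ID, catalog/schema names, and the partner product name are all placeholder assumptions, not values prescribed by this page.

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://<your-workspace>.cloud.databricks.com",
    # OAuth authentication: machine-to-machine OAuth 2.0 with a
    # service principal, rather than legacy personal access tokens.
    client_id="<service-principal-client-id>",
    client_secret="<service-principal-client-secret>",
    # User-Agent telemetry: the SDK appends "product/version" to its
    # User-Agent header so Databricks can attribute API traffic to
    # your integration. "examplepartner-connector" is hypothetical.
    product="examplepartner-connector",
    product_version="1.2.0",
)

# Unity Catalog integration: register data assets under the
# three-level namespace (catalog.schema.table) so they inherit
# centralized governance, lineage, and access control.
w.statement_execution.execute_statement(
    warehouse_id="<sql-warehouse-id>",
    statement="""
        CREATE TABLE IF NOT EXISTS main.partner_demo.events (
            event_id STRING,
            event_ts TIMESTAMP
        )
    """,
)
```

If your integration calls the REST APIs directly rather than through an SDK, the same attribution can be achieved by appending your product name and version to the `User-Agent` header on each request.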