Partner product categories

Each category page outlines integration patterns, best practices, and requirements specific to your product type, including the standards required for validation.

For foundational technical patterns that apply across all categories, see Databricks Design Patterns.

Category overview

| Category | Description | Key focus areas |
| --- | --- | --- |
| Data Engineering | Partners building data ingestion, ETL/ELT, and data transformation tools | Data ingestion, transformation, pipeline orchestration, Unity Catalog integration |
| AI/ML | Partners building AI, machine learning, and model management solutions | MLflow integration, Unity Catalog model registry, foundation models (LLMs), vector search, feature engineering |
| Business Intelligence | Partners building BI, analytics, and visualization tools | SQL Warehouses, query pushdown, OAuth, CloudFetch, Genie integration |
| Governance & Observability | Partners building data governance, cataloging, security, and observability tools | Unity Catalog integration, lineage, access control, audit logging, data security |
| Apps & Dev Tools | Partners building development tools and applications | Databricks SDKs, REST APIs, Databricks Connect, Apps framework |

Common integration requirements

All partner integrations must follow the foundational integration requirements:

  • Unity Catalog integration - Register all data assets in Unity Catalog for consistent governance
  • OAuth authentication - Use OAuth 2.0 for secure, token-based authentication
  • User-Agent telemetry - Implement User-Agent tagging for attribution and monitoring (see Telemetry & Attribution)
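As a rough illustration of the OAuth and User-Agent requirements above, the sketch below builds the headers a partner integration might attach to a Databricks REST API call. The product name, version, and header format here are placeholder assumptions; follow the Telemetry & Attribution guidance for the exact attribution string your integration should report.

```python
# Hypothetical sketch: combining OAuth bearer authentication with
# partner User-Agent tagging on outbound Databricks API requests.
# "acme-etl" and "1.4.0" are placeholder values, not a prescribed format.

def build_request_headers(access_token: str, product: str, version: str) -> dict:
    """Return HTTP headers carrying OAuth auth and partner attribution."""
    return {
        # OAuth 2.0 token-based authentication.
        "Authorization": f"Bearer {access_token}",
        # User-Agent tagging lets Databricks attribute API traffic
        # to the partner product for monitoring and attribution.
        "User-Agent": f"{product}/{version}",
    }

headers = build_request_headers("example-token", "acme-etl", "1.4.0")
```

Note that the official Databricks SDKs expose configuration options for setting this attribution automatically, so hand-built headers like these are typically only needed when calling the REST APIs directly.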

In addition to these requirements, partner integrations must follow the category-specific best practices laid out in this section.