Partner Hosted (SaaS)
In the SaaS model, you (the partner) own and operate the entire Databricks environment—workspaces, compute, storage, and data pipelines. Your customers access your product as a service without needing their own Databricks account.
This model enables partners to deliver turnkey data and AI experiences where Databricks powers the backend intelligence while remaining invisible to end users. Partners maintain full control over the platform experience, data architecture, and operational characteristics.
Reference implementation: See how Firefly Analytics implements a Partner Hosted SaaS architecture, including an architecture overview and a complete request flow that shows SSO-SPN authentication in action. For application architecture guidance, see the 5-layer architecture on the deployment models overview page.
Key Characteristics of SaaS Architectures
| Aspect | Description |
|---|---|
| Infrastructure | You control all infrastructure and operations |
| Customer experience | Customers consume the service, not the platform |
| Multi-tenancy | You implement tenant isolation and data segregation (see the sketch after this table) |
| Security & compliance | You manage all security controls and compliance requirements |
| Databricks account | Customers do not need their own Databricks account |
| Data location | Data resides in partner-controlled environment |
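For example, one common way to meet the tenant-isolation responsibility above is to give each customer a dedicated Unity Catalog catalog and grant that tenant's application principal access only to its own objects. The following is a minimal sketch, assuming the Databricks SDK for Python; the catalog, principal, and privilege choices are illustrative assumptions rather than a prescribed pattern:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

# Authenticates from the environment (DATABRICKS_HOST / DATABRICKS_TOKEN)
# or a .databrickscfg profile.
w = WorkspaceClient()

# One catalog per tenant keeps each customer's data segregated within the
# partner-owned metastore (names are illustrative).
tenant = "acme"
w.catalogs.create(name=f"tenant_{tenant}")

# Grant the tenant's application principal read-only access to its catalog only.
w.grants.update(
    securable_type=catalog.SecurableType.CATALOG,
    full_name=f"tenant_{tenant}",
    changes=[
        catalog.PermissionsChange(
            principal=f"tenant-{tenant}-app",
            add=[
                catalog.Privilege.USE_CATALOG,
                catalog.Privilege.USE_SCHEMA,
                catalog.Privilege.SELECT,
            ],
        )
    ],
)
```

Per-customer workspaces, covered under SaaS Workspace Models below, are an alternative when stronger isolation boundaries are required.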
Common Use Cases
Analytics Hub
Partners deliver real-time analytics, interactive dashboards, and business intelligence embedded directly into their application workflows. Customers get insights and recommended actions without leaving the partner's interface or managing data infrastructure. Databricks SQL warehouses power sub-second query performance at scale across millions of records, enabling partners to deliver self-service analytics experiences where users can explore data, build custom reports, and derive insights through the partner's native interface.
Customer outcome: Self-service insights and data exploration with real-time performance
Databricks capabilities: SQL Warehouses, Unity Catalog, Dashboards
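As an illustration of how a partner application might serve these insights, the sketch below runs a parameterized query against a Databricks SQL warehouse using the databricks-sql-connector package; the hostname, HTTP path, token, and table names are placeholder assumptions:

```python
from databricks import sql  # databricks-sql-connector

# Connection details for the partner-owned workspace and SQL warehouse
# (hostname, HTTP path, token, and table names are placeholder assumptions).
with sql.connect(
    server_hostname="partner-workspace.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/abc123def456",
    access_token="dapi...",
) as connection:
    with connection.cursor() as cursor:
        # Parameterized query: tenant and date filtering stay server-side,
        # so the application never interpolates user input into SQL.
        cursor.execute(
            "SELECT region, SUM(revenue) AS revenue "
            "FROM tenant_acme.gold.daily_sales "
            "WHERE sale_date >= :start_date "
            "GROUP BY region ORDER BY revenue DESC",
            {"start_date": "2025-01-01"},
        )
        for row in cursor.fetchall():
            print(row.region, row.revenue)
```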
Data Transformation Engine
Partners use Databricks pipelines and workflows as the backend processing engine powering their applications. Databricks automates streaming ingestion, log aggregation, data cleansing, transformation, enrichment, and validation—transforming raw data into the processed datasets that drive the partner's product features. Customers interact with the partner's application while Databricks Workflows orchestrate complex multi-stage data processing behind the scenes, handling everything from real-time event streams to batch enrichment at scale.
Customer outcome: Application features powered by continuously processed, high-quality data
Databricks capabilities: Jobs, Workflows, Delta Lake, Data Quality, Streaming
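A minimal sketch of one such pipeline stage, assuming Auto Loader ingestion into a Delta table from inside a Databricks notebook or job task (where `spark` is predefined); the storage paths and table names are illustrative, and a Workflows job would typically orchestrate several stages like this one:

```python
from pyspark.sql import functions as F

# Auto Loader picks up raw JSON events as they land in cloud storage; the
# stream is validated, cleansed, and enriched, then written to a Delta table
# that backs the partner application's features.
raw_events = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/Volumes/acme/bronze/_schemas/events")
    .load("/Volumes/acme/landing/events/")
)

clean_events = (
    raw_events
    .filter(F.col("event_id").isNotNull())                 # validation
    .withColumn("event_ts", F.to_timestamp("event_time"))  # cleansing
    .withColumn("ingested_at", F.current_timestamp())      # enrichment
)

(
    clean_events.writeStream
    .option("checkpointLocation", "/Volumes/acme/bronze/_checkpoints/events")
    .trigger(availableNow=True)  # process all available data, then stop
    .toTable("tenant_acme.silver.events")
)
```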
AI-Powered Applications
Partners embed machine learning and generative AI directly into their product experience—delivering predictions, recommendations, LLM-powered agents, anomaly detection, and intelligent automation. Databricks handles model training, feature engineering, and real-time inference at scale. Customers interact with AI-driven insights and intelligent assistants without seeing the underlying ML infrastructure, from fraud detection to threat analysis to next-best-action recommendations.
Customer outcome: Intelligent, predictive, and automated decision-making
Databricks capabilities: MLflow, Model Serving, Feature Engineering, Agents, Foundation Models
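For example, the partner backend can call a Databricks Model Serving endpoint over REST to score events in real time. The sketch below assumes a hypothetical fraud-scoring endpoint, an access token in the DATABRICKS_TOKEN environment variable, and an illustrative feature payload:

```python
import os

import requests

# Workspace URL, endpoint name, and feature values are illustrative
# placeholders; the token is read from the environment.
WORKSPACE_URL = "https://partner-workspace.cloud.databricks.com"
ENDPOINT_NAME = "fraud-scoring"

response = requests.post(
    f"{WORKSPACE_URL}/serving-endpoints/{ENDPOINT_NAME}/invocations",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    json={"dataframe_records": [
        {"amount": 129.99, "country": "DE", "account_age_days": 12}
    ]},
    timeout=10,
)
response.raise_for_status()
print(response.json())  # e.g. {"predictions": [0.91]}
```

The customer only sees the resulting score or recommendation surfaced in the partner's interface; the endpoint, model, and workspace remain invisible to them.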
What's Next
- SaaS Workspace Models — Multi-tenant vs per-customer workspace design
- Cost Management — Tagging and attribution strategies
- Governance — Unity Catalog patterns for multi-tenant deployments
- Automation — Infrastructure as code for workspace provisioning