
Databricks Apps

Databricks Apps enables developers to build and deploy secure data and AI applications directly on the Databricks platform, eliminating the need for separate infrastructure. Apps are hosted on the Databricks serverless platform and integrate with key services:

  • Unity Catalog for data governance
  • Databricks SQL for querying data
  • Model Serving for deploying AI models
  • Lakeflow Jobs for ETL and automation
  • OAuth and service principals for authentication

Apps support Python frameworks (Streamlit, Dash, Gradio) and Node.js frameworks (React, Angular, Express). Common use cases include interactive data visualizations, RAG chat apps, custom configuration interfaces, and business process automation.
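Whatever the framework, an app is ultimately an HTTP server listening on the port the platform assigns to it. Below is a minimal, framework-free sketch using only the Python standard library; the environment variable name DATABRICKS_APP_PORT is an assumption here and should be checked against the Databricks Apps runtime docs.

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

# The platform injects the port the app must listen on via an
# environment variable (DATABRICKS_APP_PORT is assumed here).
PORT = int(os.getenv("DATABRICKS_APP_PORT", "8000"))


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to every GET with a plain-text greeting.
        body = b"Hello from a Databricks App"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Bind to all interfaces so the platform's proxy can reach the app.
    HTTPServer(("0.0.0.0", PORT), Handler).serve_forever()
```

A real app would swap this handler for a Streamlit, Dash, Gradio, or Express server, but the contract is the same: serve HTTP on the assigned port.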

Documentation: Databricks Apps

Partner applications

Partners can use Databricks Apps to:

  • Build data-centric experiences that run directly on customer Lakehouse environments
  • Create custom UIs for partner-specific workflows that integrate with Unity Catalog and Databricks services
  • Deploy partner tools as managed apps within customer workspaces

Development lifecycle

There are two stages to configuring app resources:

  • Declaration (development) - List each required resource in the databricks.yml manifest. This declares which resources the app needs and what permissions it requires.
  • Configuration (deployment) - At deployment time, bind each declared resource to a concrete workspace-specific instance (e.g., selecting a specific SQL warehouse).

This separation keeps apps portable across customer environments: the same app code can be deployed to different workspaces, each with its own resource configuration.
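As an illustrative sketch, a resource declaration in databricks.yml might look like the following. The field names and values here are assumptions modeled on the Databricks Asset Bundles schema and should be verified against the current databricks.yml reference:

```yaml
# Illustrative sketch only; verify field names against the
# databricks.yml (Databricks Asset Bundles) schema reference.
resources:
  apps:
    my_partner_app:
      name: my-partner-app
      source_code_path: ./app
      resources:
        - name: warehouse            # app-local name for the resource
          sql_warehouse:
            id: ${var.warehouse_id}  # bound per workspace at deploy time
            permission: CAN_USE
```

The variable reference (rather than a literal warehouse ID) is what makes the same manifest deployable to different customer workspaces.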

Integration considerations

  • Apps inherit Unity Catalog governance, so partner apps automatically respect customer data permissions
  • Apps are only accessible to authenticated Databricks users within the account (no public/anonymous access)
  • Databricks enforces least-privilege access; apps must use existing resources and cannot create new ones
  • Avoid hardcoding resource IDs for portability across customer environments
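The last point can be made concrete: rather than embedding a workspace-specific warehouse ID in code, resolve it at runtime from the app's environment. A sketch, where the variable name DATABRICKS_WAREHOUSE_ID is illustrative; in practice the name is whatever the app's resource configuration maps the warehouse to:

```python
import os


def get_warehouse_id() -> str:
    """Resolve the SQL warehouse ID from the app's environment.

    DATABRICKS_WAREHOUSE_ID is an illustrative variable name, not a
    fixed platform contract; failing fast with a clear message beats
    falling back to a hardcoded ID that only exists in one workspace.
    """
    warehouse_id = os.getenv("DATABRICKS_WAREHOUSE_ID")
    if warehouse_id is None:
        raise RuntimeError(
            "DATABRICKS_WAREHOUSE_ID is not set; declare and configure "
            "the SQL warehouse resource for this app."
        )
    return warehouse_id
```

Keeping the lookup in one helper means a single change if the resource mapping is renamed during deployment to a new environment.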

For partner-hosted applications that run outside Databricks, see Apps & Dev Tools.

What's next