Integration patterns
Databricks supports a wide range of integrations that let external tools, applications, and developers interact with the Lakehouse. These integrations are generally categorized by how they connect (protocol) and who uses them (persona).
ALWAYS use the officially recommended drivers from Databricks documentation. For each integration pattern below, refer to the linked official documentation for the current recommended driver. Never use deprecated or third-party alternatives unless explicitly documented by Databricks.
| Pattern | Use Case | Telemetry |
|---|---|---|
| SQL drivers | BI tools, database clients, SQL queries | User-Agent |
| Databricks SDKs | Infrastructure as code, application development | User-Agent |
| CLI & DABs | Command-line management, CI/CD deployments | Built-in |
| REST APIs | Direct HTTP calls, custom integrations | User-Agent |
| Libraries & UDFs | Custom code on clusters, SQL functions | Library name |
SQL drivers
ODBC
The Databricks ODBC Driver enables you to connect apps, tools, and clients to Databricks through Open Database Connectivity (ODBC). See the ODBC Driver Guide for configuration details.
JDBC
The Databricks JDBC Driver (version 3.x) is the standard driver for all new applications. It is the official, open-source driver maintained by Databricks and provides modern features including native OAuth support, Cloud Fetch, and full Unity Catalog integration.
It enables you to connect tools such as DataGrip, DBeaver, and SQL Workbench/J to Databricks through Java Database Connectivity (JDBC). The driver supports native query mode and parameterized queries, and can execute statements over either the Statement Execution API or Thrift.
For new applications, use the Databricks JDBC Driver (version 3.x), not the legacy Simba JDBC driver. When configuring OAuth M2M authentication with the 3.x driver, use these properties: AuthMech=11, Auth_Flow=1, OAuth2ClientId, OAuth2Secret, UserAgentEntry, and SSL=1.
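As a minimal sketch, the properties above can be passed to a plain java.sql connection. The jdbc:databricks:// URL shape, workspace host, warehouse HTTP path, service principal credentials, and product name below are placeholders; confirm the exact connection options against the linked driver documentation.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class JdbcOAuthM2MExample {
    public static void main(String[] args) throws Exception {
        // Placeholder workspace host and SQL warehouse HTTP path -- replace with your own values.
        String url = "jdbc:databricks://<workspace-host>:443;httpPath=/sql/1.0/warehouses/<warehouse-id>";

        Properties props = new Properties();
        props.setProperty("AuthMech", "11");             // OAuth 2.0
        props.setProperty("Auth_Flow", "1");             // client credentials (M2M)
        props.setProperty("OAuth2ClientId", "<service-principal-client-id>");
        props.setProperty("OAuth2Secret", "<oauth-secret>");
        props.setProperty("UserAgentEntry", "my-product/1.0.0"); // attribution string for telemetry
        props.setProperty("SSL", "1");

        try (Connection conn = DriverManager.getConnection(url, props);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT current_user()")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```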
Resources: Databricks JDBC Driver Download | Databricks JDBC Driver Documentation | JDBC Telemetry Configuration
Node.js
The Databricks SQL Driver for Node.js is a Node.js library that allows you to use JavaScript code to run SQL commands on Databricks compute resources.
Golang
The Databricks SQL Driver for Go is a Go library that allows you to use Go code to run SQL commands on Databricks compute resources. See the driver's README, API reference, and examples for additional detail.
Documentation: ODBC | JDBC documentation | JDBC download | Node.js | Golang
User-Agent telemetry: See SQL drivers for configuration details.
Databricks SDKs
The SDKs provide a programmatic wrapper around the Databricks REST APIs, offering strong typing, automatic authentication handling, and retry logic. They are available for Python, Go, and Java.
The SDKs are the preferred method for infrastructure as code and application development.
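As an illustrative sketch with the Java SDK (assuming credentials are supplied through environment variables or a Databricks configuration profile, and the SDK dependency is on the classpath), a client can be constructed and used to call a workspace API. The package path and accessor names follow the Java SDK's conventions but should be checked against the SDK documentation.

```java
import com.databricks.sdk.WorkspaceClient;
import com.databricks.sdk.service.iam.User;

public class SdkExample {
    public static void main(String[] args) {
        // Picks up authentication from the environment (e.g. DATABRICKS_HOST / DATABRICKS_TOKEN)
        // or from a Databricks configuration profile.
        WorkspaceClient workspace = new WorkspaceClient();

        // A simple call that verifies authentication and returns the calling identity.
        User me = workspace.currentUser().me();
        System.out.println("Authenticated as: " + me.getUserName());
    }
}
```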
Documentation: Python SDK | Go SDK | Java SDK
User-Agent telemetry: See SDKs for configuration details.
Databricks CLI and DABs
The Databricks CLI is a command-line tool that wraps the SDKs for interactive management and automated deployments. Databricks Asset Bundles (DABs), built into the CLI, let you define complex data projects (jobs, pipelines, infrastructure) as code (YAML) and deploy them via CI/CD.
Documentation: Databricks CLI | Databricks Asset Bundles (DABs)
User-Agent telemetry: The CLI sends a distinct User-Agent header identifying the CLI version. This is not customizable as it identifies the tool itself.
Databricks REST APIs
This is the lowest-level interface. Everything else (SDKs, CLI, UI) is ultimately built on top of these REST endpoints.
Direct HTTP endpoints for managing resources (clusters, jobs, Unity Catalog) and executing commands. Useful when no SDK exists for your language.
Documentation: Databricks REST API Reference
User-Agent telemetry: When calling the API directly, include a User-Agent header in the format <product-name>/<version> (<comment>). See REST APIs for configuration details.
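As a rough sketch of a direct call with that User-Agent format, using Java's built-in HTTP client: the workspace host, token source, product name, and the specific endpoint shown here are example assumptions; see the REST API reference for the endpoints and fields your integration needs.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RestApiExample {
    public static void main(String[] args) throws Exception {
        // Placeholder workspace host and token -- replace with your own values.
        String host = "https://<workspace-host>";
        String token = System.getenv("DATABRICKS_TOKEN");

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(host + "/api/2.1/clusters/list"))
                .header("Authorization", "Bearer " + token)
                // Custom User-Agent in the <product-name>/<version> (<comment>) format for attribution.
                .header("User-Agent", "my-product/1.0.0 (partner-integration)")
                .GET()
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```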
Libraries and UDFs
These integrations run inside the Databricks compute runtime. You can bring your own code (Python wheels, JARs) or define custom functions (UDFs) in Unity Catalog that extend SQL and Python capabilities.
- Libraries: Install Python wheels, JARs, or R packages from PyPI, Maven, CRAN, or Unity Catalog volumes. Can be compute-scoped or notebook-scoped.
- UDFs: Create scalar or table functions in Unity Catalog that can be called from SQL or other languages.
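As a small sketch of the UDF pattern, a Unity Catalog SQL UDF can be created over JDBC and then called like any built-in function. The main.default.f_to_c function name is hypothetical, and the connection values reuse the placeholder JDBC settings from the earlier sketch; any SQL client with access to the catalog could issue the same statements.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class UdfExample {
    public static void main(String[] args) throws Exception {
        // Same placeholder JDBC URL as the earlier sketch.
        String url = "jdbc:databricks://<workspace-host>:443;httpPath=/sql/1.0/warehouses/<warehouse-id>";
        Properties props = new Properties(); // populate AuthMech, Auth_Flow, OAuth2ClientId, etc. as shown earlier

        try (Connection conn = DriverManager.getConnection(url, props);
             Statement stmt = conn.createStatement()) {
            // Hypothetical scalar SQL UDF in a catalog/schema you can write to.
            stmt.execute(
                "CREATE OR REPLACE FUNCTION main.default.f_to_c(f DOUBLE) " +
                "RETURNS DOUBLE RETURN (f - 32) * 5.0 / 9.0");

            // Once created, the UDF is addressable from SQL like any other function.
            try (ResultSet rs = stmt.executeQuery("SELECT main.default.f_to_c(98.6)")) {
                while (rs.next()) {
                    System.out.println(rs.getDouble(1)); // ~37.0
                }
            }
        }
    }
}
```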
Documentation: Cluster Libraries | Unity Catalog UDFs
User-Agent telemetry: See Libraries for configuration details.
What's next
- Review Integration Requirements for partner integration standards
- Explore Telemetry & Attribution for User-Agent configuration details
- Browse Partner Product Categories to see integration patterns by product type
- Learn about Databricks Design Patterns for authentication and data access patterns