Telemetry & Attribution Overview
Telemetry is essential for assessing partnership performance within Databricks Technology Partner integrations. User-Agent identifiers enable Databricks and partners to monitor integration usage, attribute activity to specific products, and gain insights into joint customer patterns.
What is a User-Agent?
In Databricks Technology (ISV) partner integrations, a User-Agent is a unique identifier associated with your partner product. It is applied to connections established through:
- Lakehouse APIs
- SDKs
- SQL connectors and drivers
- Lakebase integrations
The User-Agent allows usage to be attributed back to your product.
For library-based integrations, attribution is based on the library or class name rather than a runtime connection parameter.
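For connector-based integrations, the identifier is passed when the connection is opened. As a minimal sketch, assuming a recent version of the Databricks SQL Connector for Python (which accepts a `user_agent_entry` argument; older releases used `_user_agent_entry`), with placeholder credentials and a value that follows the format defined under User-Agent format below:

```python
from databricks import sql

# Sketch: tag the connection with the partner User-Agent.
# Hostname, HTTP path, and token are placeholders.
connection = sql.connect(
    server_hostname="<workspace-hostname>",
    http_path="<warehouse-http-path>",
    access_token="<personal-access-token>",
    user_agent_entry="AcmePartner_DatEngProduct/3.5",  # <isv-name_product-name>/<version>
)

with connection.cursor() as cursor:
    cursor.execute("SELECT 1")
    print(cursor.fetchone())

connection.close()
```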
Why is a User-Agent required?
Databricks requires User-Agent tagging in partner integrations for three primary reasons:
| Purpose | Description |
|---|---|
| Telemetry and usage attribution | Enables Databricks to track partner product usage and gather metrics for analysis and reporting |
| Joint customer attribution | Provides visibility into how many joint customers use a given integration and how they interact with it |
| Go-to-market (GTM) initiatives | Supports measurement of integration adoption and effectiveness of collaborative partner activities |
User-Agent format
Partners must use the following format:
<isv-name_product-name>/<product-version>
Example: For a partner named "AcmePartner" with a product named "DatEngProduct" at version 3.5:
AcmePartner_DatEngProduct/3.5
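As a quick sanity check, the string can be assembled in code; the variable names here are purely illustrative:

```python
# Illustrative only: assemble the User-Agent from its components.
isv_name = "AcmePartner"        # consistent across all of your products
product_name = "DatEngProduct"  # the product used in this integration
product_version = "3.5"         # optional but recommended

user_agent = f"{isv_name}_{product_name}/{product_version}"
assert user_agent == "AcmePartner_DatEngProduct/3.5"
```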
Format guidelines
| Component | Required | Description |
|---|---|---|
| isv-name | Yes | Your company name. Must be kept consistent across all partner products |
| product-name | Yes | The product used in this integration |
| product-version | Optional | The product version. Including this improves traceability for both you and Databricks |
Partners must use the underscore (_) as the separator in the User-Agent string. This format ensures consistent attribution across all connectors, drivers, and languages. If your integration has issues handling the underscore, contact the Databricks Partner Engineering team for review.
Each product or integration must have a distinct User-Agent value. Partners must set this value programmatically in the Databricks connection code path (for example, JDBC, ODBC, or SDK); you can't rely on joint customers to configure it themselves. This is a strict requirement for Databricks partner integrations.
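How the value is set depends on the mechanism. As one hedged sketch, the Databricks SDK for Python accepts `product` and `product_version` arguments that are folded into the User-Agent header on every API call (placeholder credentials; check the SDK documentation for your language for the exact parameters):

```python
from databricks.sdk import WorkspaceClient

# Sketch: the SDK appends "<product>/<product_version>" to its
# User-Agent header, so every API call is attributed to the product.
w = WorkspaceClient(
    host="<workspace-url>",           # placeholder
    token="<personal-access-token>",  # placeholder
    product="AcmePartner_DatEngProduct",
    product_version="3.5",
)

# Subsequent calls carry the partner attribution.
for warehouse in w.warehouses.list():
    print(warehouse.name)
```

For JDBC and ODBC, the equivalent is typically a driver connection property (for example, `UserAgentEntry` in the Databricks JDBC driver); see the driver-specific setup pages.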
User-Agent validation
Partners can verify that the User-Agent is implemented correctly using one of two methods:
Query History UI
Query History displays workloads from Databricks SQL, serverless compute, and the SQL Execution API. The Source column reflects the User-Agent used by the partner application.
system.access.audit table
For comprehensive verification across all workloads, query the system.access.audit system table:
SELECT *
FROM system.access.audit
WHERE event_time > current_timestamp() - INTERVAL 2 DAYS
  AND lower(user_agent) LIKE '%<your-user-agent>%';
Replace <your-user-agent> with your User-Agent string in lowercase, since the query lower-cases the user_agent column before matching. Request access to system tables if you don't already have it.
Supported integration mechanisms
Most Databricks integration types support implementing the User-Agent tag. The tables below outline the supported integration types with links to detailed setup guidance.
Lakehouse integrations
| Integration Type | Components |
|---|---|
| SQL Drivers | JDBC, ODBC, Node.js, Go |
| SQL Connectors and Frameworks | Python SQL Connector, SQLAlchemy, PyODBC |
| SDKs | Databricks SDK for Python, Go, Java |
| REST APIs | Unity Catalog REST APIs, Iceberg Integration |
| Databricks Connect | Scala, Python |
| Libraries / UDFs | Python or Scala Libraries |
Lakebase integrations
| Integration Type | Components |
|---|---|
| Lakebase Integrations | Postgres JDBC, psql Client, psycopg2/3, SQLAlchemy |
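Lakebase clients speak the Postgres wire protocol, so there is no Databricks-specific connector parameter here. One hedged sketch, assuming attribution flows through the standard Postgres `application_name` setting (confirm the exact mechanism with the Databricks Partner Engineering team), using psycopg2 with placeholder credentials:

```python
import psycopg2

# Sketch: libpq's application_name carries the partner identifier.
# Host, database, and credentials are placeholders.
conn = psycopg2.connect(
    host="<lakebase-hostname>",
    dbname="<database>",
    user="<user>",
    password="<password>",
    sslmode="require",
    application_name="AcmePartner_DatEngProduct/3.5",
)

with conn.cursor() as cur:
    cur.execute("SHOW application_name;")  # check what the server recorded
    print(cur.fetchone())

conn.close()
```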
What's next
- Configure SQL drivers: Set up User-Agent telemetry for JDBC, ODBC, Node.js, and Go drivers. See SQL Drivers.
- Configure SDKs: Implement User-Agent attribution in Databricks SDKs for Python, Go, and Java. See SDKs.
- Review integration requirements: Understand all requirements for Databricks partner integrations. See Integration Requirements.