Telemetry & Attribution Overview

Telemetry is essential for assessing partnership performance within Databricks Technology Partner integrations. User-Agent identifiers enable Databricks and partners to monitor integration usage, attribute activity to specific products, and gain insights into joint customer patterns.

What is a User-Agent?

In Databricks Technology (ISV) partner integrations, a User-Agent is a unique identifier for your partner product. It is applied to connections established through:

  • Lakehouse APIs
  • SDKs
  • SQL connectors and drivers
  • Lakebase integrations

The User-Agent allows usage to be attributed back to your product.

note

For library-based integrations, attribution is based on the library or class name rather than a runtime connection parameter.

Why is a User-Agent required?

Databricks requires User-Agent tagging in partner integrations for three primary reasons:

| Purpose | Description |
| --- | --- |
| Telemetry and usage attribution | Enables Databricks to track partner product usage and gather metrics for analysis and reporting |
| Joint customer attribution | Provides visibility into how many joint customers use a given integration and how they interact with it |
| Go-to-market (GTM) initiatives | Supports measurement of integration adoption and effectiveness of collaborative partner activities |

User-Agent format

Partners must use the following format:

<isv-name_product-name>/<product-version>

Example: For a partner named "AcmePartner" with a product named "DatEngProduct" at version 3.5:

AcmePartner_DatEngProduct/3.5
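
As a rough sketch (the names and the regex below are illustrative, not an official specification), the string can be assembled and sanity-checked programmatically before it is passed to a connector:

import re

# Illustrative values only; substitute your own company and product names.
ISV_NAME = "AcmePartner"
PRODUCT_NAME = "DatEngProduct"
PRODUCT_VERSION = "3.5"  # optional, but recommended for traceability

# <isv-name_product-name>/<product-version>
user_agent = f"{ISV_NAME}_{PRODUCT_NAME}/{PRODUCT_VERSION}"

# Loose sanity check: underscore separator, optional "/version" suffix.
assert re.fullmatch(r"[A-Za-z0-9.\-]+_[A-Za-z0-9.\-]+(/[A-Za-z0-9.]+)?", user_agent)
print(user_agent)  # AcmePartner_DatEngProduct/3.5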

Format guidelines

| Component | Required | Description |
| --- | --- | --- |
| isv-name | Yes | Your company name. Must be kept consistent across all partner products |
| product-name | Yes | The product used in this integration |
| product-version | Optional | The product version. Including this improves traceability for both you and Databricks |

important

Partners must use the underscore (_) as the separator in the User-Agent string. This format ensures consistent attribution across all connectors, drivers, and languages. If your integration has issues handling the underscore, contact the Databricks Partner Engineering team for review.

Each product or integration must have a distinct User-Agent value. Partners must set this value programmatically in the Databricks connection code path (for example, JDBC, ODBC, SDK). You can't rely on joint customers to configure it themselves—this is a strict requirement for Databricks partner integrations.
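
For example, with the Databricks SQL Connector for Python the tag can be set directly in the connection call. This is a minimal sketch assuming the connector's user_agent_entry parameter (older releases named it _user_agent_entry); the hostname, HTTP path, and token are placeholders:

from databricks import sql

# Placeholder connection details; real values come from your product's configuration.
connection = sql.connect(
    server_hostname="<workspace-hostname>",
    http_path="<warehouse-http-path>",
    access_token="<access-token>",
    user_agent_entry="AcmePartner_DatEngProduct/3.5",  # set in code, never by the customer
)

with connection.cursor() as cursor:
    cursor.execute("SELECT 1")
    print(cursor.fetchone())

connection.close()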

User-Agent validation

Partners can verify that the User-Agent is implemented correctly using one of two methods:

Query History UI

Query History displays workloads from Databricks SQL, serverless compute, and the SQL Execution API. The Source column reflects the User-Agent used by the partner application.

system.access.audit table

For comprehensive verification across all workloads, query the system.access.audit system table:

SELECT *
FROM system.access.audit
WHERE event_time > current_timestamp() - INTERVAL 2 days
  AND lower(user_agent) LIKE '%<your-user-agent>%';

note

Request access to system tables if needed.

Supported integration mechanisms

Most Databricks integration types support implementing the User-Agent tag. The tables below outline the supported integration types with links to detailed setup guidance.

Lakehouse integrations

| Integration Type | Components |
| --- | --- |
| SQL Drivers | JDBC, ODBC, Node.js, Go |
| SQL Connectors and Frameworks | Python SQL Connector, SQLAlchemy, PyODBC |
| SDKs | Databricks SDK for Python, Go, Java |
| REST APIs | Unity Catalog REST APIs, Iceberg Integration |
| Databricks Connect | Scala, Python |
| Libraries / UDFs | Python or Scala Libraries |
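
As one illustration of the SDKs row above, the Databricks SDK for Python lets the client identify the calling product. This is a minimal sketch assuming the SDK's product and product_version arguments, which are folded into its User-Agent; the host and token are placeholders:

from databricks.sdk import WorkspaceClient

# Placeholder credentials; use your product's normal auth flow in practice.
w = WorkspaceClient(
    host="https://<workspace-hostname>",
    token="<access-token>",
    product="AcmePartner_DatEngProduct",  # assumption: appended to the SDK's User-Agent
    product_version="3.5",
)

# Any API call made through this client now carries the partner identifier.
for warehouse in w.warehouses.list():
    print(warehouse.name)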

Lakebase integrations

| Integration Type | Components |
| --- | --- |
| Lakebase Integrations | Postgres JDBC, psql Client, psycopg2/3, SQLAlchemy |
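
Postgres clients don't send an HTTP User-Agent, so the sketch below assumes the identifier travels in the standard application_name connection parameter; confirm the exact attribution mechanism for Lakebase in the detailed setup guidance. Shown with psycopg2, with placeholder connection details:

import psycopg2

# Placeholder Lakebase connection details.
conn = psycopg2.connect(
    host="<lakebase-host>",
    dbname="<database>",
    user="<user>",
    password="<password>",
    # Assumption: the partner identifier is carried in application_name,
    # the standard libpq parameter surfaced in Postgres telemetry.
    application_name="AcmePartner_DatEngProduct/3.5",
)

with conn.cursor() as cur:
    cur.execute("SELECT version()")
    print(cur.fetchone())

conn.close()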

What's next

  • Configure SQL drivers: Set up User-Agent telemetry for JDBC, ODBC, Node.js, and Go drivers. See SQL Drivers.
  • Configure SDKs: Implement User-Agent attribution in Databricks SDKs for Python, Go, and Java. See SDKs.
  • Review integration requirements: Understand all requirements for Databricks partner integrations. See Integration Requirements.