SDKs
This guide covers User-Agent configuration for Databricks SDKs. Each SDK provides functions to register partner and product identifiers that are applied globally to all SDK requests.
Databricks SDK for Python
With the Python SDK, partners must use the useragent.with_partner() and useragent.with_product() functions to register their ISV and product identifiers:
import os

from databricks.sdk import WorkspaceClient, useragent
from databricks.sdk.core import Config

# Set partner and product identifiers
useragent.with_partner("<isv-name>")
useragent.with_product("<product-name>", "<product-version>")

# Auth from environment variables (OAuth M2M)
cfg = Config(
    host=os.getenv("DATABRICKS_HOST"),
    client_id=os.getenv("DATABRICKS_CLIENT_ID"),
    client_secret=os.getenv("DATABRICKS_CLIENT_SECRET"),
    auth_type="oauth-m2m",
)

# Workspace client
w = WorkspaceClient(config=cfg)

# Quick test: list schemas in the samples catalog
for s in w.schemas.list(catalog_name="samples"):
    print("-", s.name)
note
- with_partner() identifies the ISV or integration owner.
- with_product() specifies the product name and version (the version must follow SemVer; see the concrete example below).
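For example, a partner shipping version 2.1.0 of a connector would register values like these (the names are hypothetical and shown only to illustrate a SemVer-compliant version):

# Hypothetical values; replace with your own ISV and product identifiers
useragent.with_partner("acme")
useragent.with_product("acme-connector", "2.1.0")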
Databricks SDK for Go
With the Go SDK, partners must use the useragent.WithPartner() and useragent.WithProduct() functions:
package main

import (
    "github.com/databricks/databricks-sdk-go"
    "github.com/databricks/databricks-sdk-go/useragent"
)

func main() {
    // Set partner and product identifiers before creating any client
    useragent.WithPartner("<isv-name>")
    useragent.WithProduct("<product-name>", "<product-version>")

    // Workspace client (auth is resolved from the environment)
    w, err := databricks.NewWorkspaceClient()
    if err != nil {
        panic(err)
    }

    // Example API call... (a sketch follows this block)
    _ = w
}
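To mirror the Python listing above, the placeholder API call could enumerate schemas in a catalog. The snippet below is a minimal sketch rather than part of the official example; it assumes your SDK version exposes the Schemas service's ListAll helper and that the workspace has a samples catalog, and it also needs the context, fmt, and service/catalog imports added to the file:

    // List schemas in the samples catalog and print their names
    schemas, err := w.Schemas.ListAll(context.Background(), catalog.ListSchemasRequest{
        CatalogName: "samples",
    })
    if err != nil {
        panic(err)
    }
    for _, s := range schemas {
        fmt.Println("-", s.Name)
    }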
note
- WithPartner() identifies the ISV or integration owner.
- WithProduct() sets the product name and version (must be a valid SemVer version).
Databricks SDK for Java
For the Databricks SDK for Java, use the UserAgent.withProduct() and UserAgent.withPartner() methods:
package com.example;

import com.databricks.sdk.WorkspaceClient;
import com.databricks.sdk.core.DatabricksConfig;
import com.databricks.sdk.core.UserAgent;
import com.databricks.sdk.service.catalog.CatalogInfo;
import com.databricks.sdk.service.catalog.ListCatalogsRequest;

public class ListCatalogsExample {
    public static void main(String[] args) {
        // Load configuration from environment
        String host = System.getenv("DATABRICKS_HOST");
        String clientId = System.getenv("DATABRICKS_CLIENT_ID");
        String clientSecret = System.getenv("DATABRICKS_CLIENT_SECRET");

        // Set partner and product identifiers
        UserAgent.withProduct("<product-name>", "<product-version>");
        UserAgent.withPartner("<isv-name>");

        // Build configuration
        DatabricksConfig config = new DatabricksConfig()
            .setHost(host)
            .setClientId(clientId)
            .setClientSecret(clientSecret);

        // Create workspace client
        WorkspaceClient client = new WorkspaceClient(config);

        // Example: list catalogs whose names start with 's'
        for (CatalogInfo catalog : client.catalogs().list(new ListCatalogsRequest())) {
            String name = catalog.getName();
            if (name != null && name.toLowerCase().startsWith("s")) {
                System.out.println("Catalog: " + name);
            }
        }
    }
}
note
- withPartner() identifies the ISV or integration owner.
- withProduct() sets the product name and version (must be a valid SemVer version).
What's next
- Configure REST APIs: Set up User-Agent headers for direct API calls. See REST APIs.
- Configure Databricks Connect: Implement telemetry for Spark sessions. See Databricks Connect.
- Review User-Agent format: Understand the required format and guidelines. See Telemetry & Attribution.