Databricks Connect
With Databricks Connect for Python and Scala, partners must set the `userAgent` connection parameter using the session builder.
Scala
import com.databricks.connect.DatabricksSession
import org.apache.spark.sql.SparkSession

object ScalaDbConnectTest {
  def main(args: Array[String]): Unit = {
    val spark: SparkSession =
      DatabricksSession.builder
        .userAgent("<isv-name_product-name>/<product-version>")
        .getOrCreate()

    println(s"Spark Version: ${spark.version}")
    spark.sql("SELECT current_user(), current_timestamp()").show(false)
    spark.stop()
  }
}
Python
from databricks.connect import DatabricksSession
from databricks.sdk.core import Config


def main():
    # Build SDK config from environment
    config = Config()

    user_agent = "<isv-name_product-name>/<product-version>"
    print("User-Agent:", user_agent)

    # Create Spark session with Databricks Connect
    spark = (
        DatabricksSession.builder
        .sdkConfig(config)
        .userAgent(user_agent)
        .getOrCreate()
    )

    print(f"Spark version: {spark.version}")

    # Verify current user/timestamp
    verify_df = spark.sql("SELECT current_user() AS user, current_timestamp() AS ts")
    verify_df.show(truncate=False)

    # DataFrame test
    df = spark.range(5)
    print("Result:", df.collect())
    print("Done.")


if __name__ == "__main__":
    main()
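Before building the session, it can be useful to sanity-check that the `<isv-name_product-name>/<product-version>` placeholders have actually been replaced with real values. The sketch below is a minimal, illustrative check; the regex is an assumption about the expected shape (identifier, slash, semver-style version), not the authoritative format, which is documented in Telemetry & Attribution:

```python
import re

# Assumed shape: an identifier (letters, digits, hyphens, underscores),
# a slash, then a dotted numeric version. Illustrative only; see
# Telemetry & Attribution for the authoritative User-Agent guidelines.
USER_AGENT_PATTERN = re.compile(r"^[A-Za-z0-9][A-Za-z0-9_-]*/\d+\.\d+(\.\d+)?$")


def is_valid_user_agent(user_agent: str) -> bool:
    """Return True if the string matches the assumed partner User-Agent shape."""
    return bool(USER_AGENT_PATTERN.match(user_agent))


# A filled-in value passes; unreplaced placeholders do not.
print(is_valid_user_agent("acme_dataloader/2.1.0"))                       # True
print(is_valid_user_agent("<isv-name_product-name>/<product-version>"))   # False
```

Running a check like this at startup (and failing fast with a clear error) avoids shipping a connector whose telemetry string was never customized.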
What's next
- Configure libraries: Implement telemetry for Python and Scala libraries. See Libraries & UDFs.
- Configure Lakebase: Set up telemetry for PostgreSQL-compatible clients. See Lakebase Integrations.
- Review User-Agent format: Understand the required format and guidelines. See Telemetry & Attribution.
- Generate connector code with AI: An open-source toolkit produces PWAF-compliant connectors with authentication and User-Agent telemetry preconfigured. It works with Claude Code, Cursor, Codex, and any MCP-compatible AI coding assistant. See Partner AI Dev-Kit.