O2D sharing
Open-to-Databricks (O2D) sharing enables you to consume data shared by external providers into your Databricks environment. This is the inverse of the D2O pattern—instead of sharing data out, you're bringing shared data in. For step-by-step instructions, see Access data shared with you.
- Consume Delta Shares from external providers
- Authenticate using bearer tokens or OpenID Connect (OIDC) federation
- Data lands in your Unity Catalog as external tables
- Full governance and lineage tracking once data is in your environment
Consuming external shares
Credential files
External providers supply a credential file (a .share file) containing connection details and a bearer token. Import this file to create a connection to the external share.
- Treat .share files as secrets: store them securely and do not commit them to version control
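The .share file is a small JSON document following the Delta Sharing profile format (fields such as shareCredentialsVersion, endpoint, bearerToken, and expirationTime). A minimal sketch of loading a profile and flagging a soon-to-expire token; the sample values below are placeholders, not real credentials:

```python
import json
from datetime import datetime, timedelta, timezone

# Placeholder profile content; a real .share file is issued by the provider.
sample_profile = json.loads("""
{
  "shareCredentialsVersion": 1,
  "endpoint": "https://sharing.provider.example/delta-sharing/",
  "bearerToken": "<redacted>",
  "expirationTime": "2030-01-01T00:00:00Z"
}
""")

def token_expires_within(profile: dict, days: int) -> bool:
    """True if the profile's bearer token expires within `days` days."""
    exp = datetime.fromisoformat(profile["expirationTime"].replace("Z", "+00:00"))
    return exp - datetime.now(timezone.utc) < timedelta(days=days)
```

A check like this can feed the expiration alerts described under best practices below.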
Authentication
Two authentication methods are supported:
- Bearer tokens: Provider issues a token with a configured lifetime
- OpenID Connect (OIDC) federation: Authenticate using your existing identity provider
Automate token refresh for OIDC flows to avoid access interruptions.
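One common way to automate refresh is to cache the access token and re-fetch it shortly before expiry. A minimal sketch; `fetch_token` is a hypothetical callable standing in for whatever your identity provider's OAuth client exposes:

```python
import time

class RefreshingToken:
    """Cache an access token and refresh it `skew` seconds before expiry."""

    def __init__(self, fetch_token, skew: int = 60):
        # fetch_token() is a hypothetical callable returning
        # (token, lifetime_in_seconds) from your identity provider.
        self._fetch = fetch_token
        self._skew = skew
        self._token = None
        self._expires_at = 0.0

    def get(self) -> str:
        # Re-fetch only when the cached token is about to expire.
        if time.time() >= self._expires_at - self._skew:
            self._token, lifetime = self._fetch()
            self._expires_at = time.time() + lifetime
        return self._token
```

Callers always go through `get()`, so downstream code never holds a stale token.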
Best practices
Validate data on ingestion
- Check schema compatibility before building downstream pipelines
- Monitor for schema changes from the provider
- Set up alerts for access failures or token expiration
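A schema check can be as simple as diffing the column-to-type mapping your pipeline was built against with what the provider currently exposes. A minimal sketch; the column maps are placeholders:

```python
def schema_drift(expected: dict, actual: dict):
    """Compare column->type maps; return (missing, added, type_changed)."""
    missing = sorted(set(expected) - set(actual))
    added = sorted(set(actual) - set(expected))
    changed = sorted(c for c in set(expected) & set(actual)
                     if expected[c] != actual[c])
    return missing, added, changed

# Example: the provider dropped `region` and widened `amount`.
expected = {"order_id": "bigint", "amount": "decimal(10,2)", "region": "string"}
actual = {"order_id": "bigint", "amount": "decimal(18,2)", "customer": "string"}
missing, added, changed = schema_drift(expected, actual)
```

Run a check like this before each pipeline run and alert when any of the three sets is non-empty.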
Govern shared data
Once external data lands in your Unity Catalog:
- Apply appropriate tags and classifications
- Set up access controls for downstream consumers
- Track lineage from external source through your pipelines
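In Unity Catalog, tagging and access control are plain SQL (`ALTER TABLE ... SET TAGS` and `GRANT`). A sketch that builds the statements for a newly landed table; the table, tag, and group names are placeholders:

```python
def governance_statements(table: str, tags: dict, reader_group: str) -> list:
    """Build Unity Catalog SQL to tag a table and grant read access."""
    tag_pairs = ", ".join(f"'{k}' = '{v}'" for k, v in tags.items())
    return [
        f"ALTER TABLE {table} SET TAGS ({tag_pairs})",
        f"GRANT SELECT ON TABLE {table} TO `{reader_group}`",
    ]

stmts = governance_statements(
    "main.external_shares.orders",
    {"source": "external-provider", "classification": "internal"},
    "analytics-readers",
)
```

In a notebook you would run each statement with `spark.sql(...)`; generating them in one place keeps tagging consistent across every table a provider shares with you.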
Monitor usage and costs
- Track data transfer volumes from external providers
- Monitor query patterns on shared data
- Review egress costs if data is replicated or transformed
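Volume monitoring can start as a simple threshold over daily transfer figures, wherever you source them (for example, your own ingestion metrics). A minimal sketch with placeholder data:

```python
def days_over_budget(daily_gb: dict, budget_gb: float) -> list:
    """Return the days whose transfer volume exceeded the budget."""
    return sorted(day for day, gb in daily_gb.items() if gb > budget_gb)

# Placeholder daily transfer volumes in GB.
daily_gb = {"2025-06-01": 40.2, "2025-06-02": 95.7, "2025-06-03": 12.0}
# With a 50 GB/day budget, only 2025-06-02 trips the alert.
alerts = days_over_budget(daily_gb, budget_gb=50.0)
```

Wire the result into whatever alerting channel your team already uses before moving to more detailed cost attribution.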
What's next
- Learn about D2D sharing patterns for Databricks-to-Databricks
- Learn about D2O sharing patterns for outbound sharing
- Explore bi-directional sharing for two-way data collaboration
- Set up monitoring to track sharing activity