Delta Sharing
A share is a container that holds data assets you can make available to external users, either through the Databricks Marketplace or via a direct share. Within a share, you can include Unity Catalog objects such as tables, views, volumes, notebooks, and models. Each asset type plays a different role in helping consumers access and use your data. For a comprehensive overview of the Delta Sharing protocol, see What is Delta Sharing?
The following asset types are available only in Databricks-to-Databricks (D2D) sharing: volumes, notebooks, models, and MCP.
Creating a share
Create a share and add assets to it:
-- Create a share
CREATE SHARE sales_data
COMMENT 'Sales data for analytics partners';
-- Add a table
ALTER SHARE sales_data ADD TABLE catalog.schema.sales;
-- Add a table with history (enables time travel and CDF)
ALTER SHARE sales_data ADD TABLE catalog.schema.orders WITH HISTORY;
For full syntax, see CREATE SHARE.
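A share grants access to nothing until you grant it to a recipient. A minimal sketch, assuming a recipient named analytics_partner has already been created (see Set up recipients, linked below):
-- Grant a recipient read access to everything in the share
GRANT SELECT ON SHARE sales_data TO RECIPIENT analytics_partner;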
Tables
Tables are the most common type of shared asset. They contain structured data that consumers can query directly using SQL, Python, or other supported languages.
When you share tables, you can choose to include table history, which enables time travel queries and incremental reads. If your data changes regularly, you can also enable Change Data Feed (CDF) so consumers can track and process only the rows that have changed, as shown in the sketch below. For details, see Add tables to a share and Change Data Feed.
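For example, you might enable CDF on the source table before adding it to the share with history. The catalog, schema, and table names below are illustrative, and shared_catalog stands in for whatever catalog the consumer creates from your share:
-- Provider side: enable Change Data Feed before sharing the table with history
ALTER TABLE catalog.schema.orders
SET TBLPROPERTIES (delta.enableChangeDataFeed = true);
-- Consumer side: read only the rows that changed since version 5
SELECT * FROM table_changes('shared_catalog.schema.orders', 5);
-- Consumer side: time travel to an earlier version of the shared table
SELECT * FROM shared_catalog.schema.orders VERSION AS OF 5;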
Views
Views allow you to deliver different subsets of data to different customers without creating separate tables or shares for each one. By using the current_recipient() function inside your SQL logic, you can dynamically filter data based on the recipient's identity:
-- Each recipient sees only the rows matching their 'region' property
CREATE VIEW acme.marketplace.orders_view AS
SELECT * FROM acme.internal.orders
WHERE region = current_recipient('region');
This approach supports fine-grained access control, reduces duplication, and simplifies maintenance since you only need to manage a single dataset. See Dynamic Views & Data Filtering for detailed patterns. For full syntax and more examples, see current_recipient function.
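To put this into practice, add the view to your share and tag each recipient with the property the view filters on. A sketch reusing the sales_data share and the hypothetical analytics_partner recipient from earlier:
-- Add the dynamic view to the share
ALTER SHARE sales_data ADD VIEW acme.marketplace.orders_view;
-- Set the property that current_recipient('region') resolves for this recipient
ALTER RECIPIENT analytics_partner SET PROPERTIES ('region' = 'EMEA');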
Volumes
Volumes are useful when you need to share unstructured or semi-structured content. A volume can contain almost any file type, including:
- Images, videos, and audio files
- Documents and PDFs
- Private code libraries that can be imported into consumer notebooks
- Logs and JSON files
This flexibility gives data providers a way to share richer content, such as AI training data, multimedia archives, or custom tools, alongside their structured datasets. Volumes are particularly valuable for AI use cases: you can include training datasets, reference files, data dictionaries, and sample code that helps consumers integrate your data into their ML pipelines. See AI readiness for more on packaging AI-ready data products.
Recipients have read-only access to shared volumes. For details, see Add volumes to a share.
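Adding a volume follows the same ALTER SHARE pattern as tables; the volume name below is illustrative:
-- Add a volume of files to the share (Databricks-to-Databricks sharing only)
ALTER SHARE sales_data ADD VOLUME catalog.schema.training_assets;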
Notebooks
Notebooks can be shared to help consumers get value from your data more quickly. You can include examples written in SQL, Python, Scala, or R that show how to query or transform the data. Notebooks can also demonstrate common use cases, analytic workflows, or integration patterns that your consumers can adapt for their own environments.
Sharing notebooks helps shorten onboarding time and improves adoption by showing users exactly how to work with your data. See Listings for best practices on sample notebooks.
Models
Model sharing in Databricks lets organizations discover, share, deploy, and serve machine learning and AI models securely and efficiently across teams, clouds, platforms, and regions.
Delta Sharing supports native model sharing in Unity Catalog, so models can be made available to trusted consumers without manual file transfers or replicated storage. Consumers can then load, evaluate, and deploy these shared models directly in their own environments for inference, fine-tuning, production serving, or experimentation without needing to rebuild or retrain the model from scratch.
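If you manage shares with SQL, registered models are added just like tables and volumes; the model name below is illustrative:
-- Add a Unity Catalog registered model to the share (Databricks-to-Databricks sharing only)
ALTER SHARE sales_data ADD MODEL catalog.schema.churn_model;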
Model Context Protocol (MCP)
MCP is an open-source standard that enables AI agents to interact with tools, APIs, data sources, and workflows in a standardized way. The main benefit is portability: you can create a tool once and use it with any agent.
In the Databricks Marketplace, MCP products are connection objects that Agent Bricks and other AI agents can install and connect to for execution, tool-calling, and contextual enrichment. This capability is critical for building agentic workflows that combine models, data, and tools into cohesive, production-ready solutions.
For details, see Model Context Protocol (MCP) on Databricks.
What's next
- Learn about D2D sharing patterns for structured and unstructured data
- Set up recipients for your shares
- Create high-quality listings for your data products