
DQX Engine

To perform data quality checking with DQX, you must create a DQEngine object. The engine requires a Databricks workspace client for authentication and interaction with the Databricks workspace.

When running code in a Databricks workspace, the workspace client is authenticated automatically, whether DQX is used in a notebook, script, or job/workflow. In that case, the following code is all you need to create the workspace client and the engine:

from databricks.sdk import WorkspaceClient
from databricks.labs.dqx.engine import DQEngine

ws = WorkspaceClient()
dq_engine = DQEngine(ws)

For external environments, such as CI servers or local machines, you can authenticate to Databricks using any method supported by the Databricks SDK. For detailed instructions, refer to the default authentication flow. If you use Databricks configuration profiles or Databricks-specific environment variables for authentication, you can create the workspace client without providing additional arguments:

ws = WorkspaceClient()
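
For example, if you have a configuration profile defined in ~/.databrickscfg, you can select it explicitly. This is a minimal sketch; the profile name below is hypothetical:

from databricks.sdk import WorkspaceClient

# "my-profile" is a hypothetical profile name; replace it with a profile
# defined in your ~/.databrickscfg
ws = WorkspaceClient(profile="my-profile")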

The DQEngine is initialized by default with the standard Spark session available in the current environment. If you need a custom Spark session, such as one from Databricks Connect, you can pass it as an argument when creating the DQEngine instance:

from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.getOrCreate()
dq_engine = DQEngine(ws, spark)  # ws: the WorkspaceClient created earlier

For local execution without a Databricks workspace, please refer to the local testing section.

DQX engine methods

The following list outlines the available methods of the DQEngine. Each entry gives the method's description, its arguments, and whether it supports local execution:

  • apply_checks: Applies quality checks to the DataFrame and returns a DataFrame with reporting columns.
    • Arguments: df: DataFrame to check; checks: list of checks defined using DQX classes (each check is an instance of the DQRule class); ref_dfs: reference DataFrames to use in the checks, if applicable.
    • Supports local execution: Yes
  • apply_checks_and_split: Applies quality checks to the DataFrame and returns valid and invalid (quarantine) DataFrames with reporting columns.
    • Arguments: df: DataFrame to check; checks: list of checks defined using DQX classes (each check is an instance of the DQRule class); ref_dfs: reference DataFrames to use in the checks, if applicable.
    • Supports local execution: Yes
  • apply_checks_and_save_in_table: Applies quality checks using DQRule objects and writes results to valid and invalid Delta table(s) with reporting columns.
    • Arguments: input_config: InputConfig object with the table name and options for reading the input data; checks: list of checks defined using DQX classes (each check is an instance of the DQRule class); output_config: OutputConfig object with the table name, output mode, and options for the output data; quarantine_config: OutputConfig object with the table name, output mode, and options for the quarantine data (if provided, the data is split); ref_dfs: reference DataFrames to use in the checks, if applicable.
    • Supports local execution: No
  • apply_checks_by_metadata: Applies quality checks defined as dictionaries to the DataFrame and returns a DataFrame with reporting columns.
    • Arguments: df: DataFrame to check; checks: list of checks defined as dictionaries; custom_check_functions: (optional) dictionary with custom check functions (e.g., globals() of the calling module); ref_dfs: reference DataFrames to use in the checks, if applicable.
    • Supports local execution: Yes
  • apply_checks_by_metadata_and_split: Applies quality checks defined as dictionaries and returns valid and invalid (quarantine) DataFrames.
    • Arguments: df: DataFrame to check; checks: list of checks defined as dictionaries; custom_check_functions: (optional) dictionary with custom check functions (e.g., globals() of the calling module); ref_dfs: reference DataFrames to use in the checks, if applicable.
    • Supports local execution: Yes
  • apply_checks_by_metadata_and_save_in_table: Applies quality checks defined as dictionaries and writes results to valid and invalid Delta table(s) with reporting columns.
    • Arguments: input_config: InputConfig object with the table name and options for reading the input data; checks: list of checks defined as dictionaries; output_config: OutputConfig object with the table name, output mode, and options for the output data; quarantine_config: OutputConfig object with the table name, output mode, and options for the quarantine data (if provided, the data is split); custom_check_functions: (optional) dictionary with custom check functions; ref_dfs: reference DataFrames to use in the checks, if applicable.
    • Supports local execution: No
  • validate_checks: Validates the provided quality checks to ensure they conform to the expected structure and types.
    • Arguments: checks: list of checks to validate; custom_check_functions: (optional) dictionary of custom check functions that can be used; validate_custom_check_functions: (optional) if True, validates custom check functions (defaults to True).
    • Supports local execution: Yes
  • get_invalid: Retrieves records from the DataFrame that violate data quality checks (records with warnings and errors).
    • Arguments: df: input DataFrame.
    • Supports local execution: Yes
  • get_valid: Retrieves records from the DataFrame that pass all data quality checks.
    • Arguments: df: input DataFrame.
    • Supports local execution: Yes
  • load_checks: Loads quality rules (checks) from storage. Multiple storage types are supported, including tables, local or workspace files, Unity Catalog Volume files, and installation-specific sources where the location is inferred automatically from the run configuration.
    • Arguments: config: storage configuration for loading checks, i.e. FileChecksStorageConfig (file in the local filesystem, YAML or JSON), WorkspaceFileChecksStorageConfig (file in the workspace, YAML or JSON), VolumeFileChecksStorageConfig (file in a Unity Catalog Volume, YAML or JSON), TableChecksStorageConfig (a table), or InstallationChecksStorageConfig (storage defined in the installation context, using the checks_location field from the run configuration). See more details below.
    • Supports local execution: Yes (only with FileChecksStorageConfig)
  • save_checks: Saves quality rules (checks) to storage. Multiple storage types are supported, including tables, local or workspace files, Unity Catalog Volume files, and installation-specific targets where the location is inferred automatically from the run configuration.
    • Arguments: checks: list of checks defined as dictionaries; config: storage configuration for saving checks, i.e. FileChecksStorageConfig (file in the local filesystem, YAML or JSON), WorkspaceFileChecksStorageConfig (file in the workspace, YAML or JSON), VolumeFileChecksStorageConfig (file in a Unity Catalog Volume, YAML or JSON), TableChecksStorageConfig (a table), or InstallationChecksStorageConfig (storage defined in the installation context, using the checks_location field from the run configuration). See more details below.
    • Supports local execution: Yes (only with FileChecksStorageConfig)
  • save_results_in_table: Saves quality checking results in Delta table(s).
    • Arguments: output_df: (optional) DataFrame containing the output data; quarantine_df: (optional) DataFrame containing the quarantine data; output_config: OutputConfig object with the table name, output mode, and options for the output data; quarantine_config: OutputConfig object with the table name, output mode, and options for the quarantine data (if provided, the data is split); run_config_name: name of the run config to use; assume_user: if True, assume user installation.
    • Supports local execution: No

'Supports local execution: Yes' indicates that a method can be used for local testing without a Databricks workspace (see the usage in the local testing section).
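
As a minimal sketch of the metadata-defined flow, the snippet below applies a single is_not_null check and splits the results. The table and column names are illustrative, and the check dictionary schema should be verified against the DQX checks documentation for your installed version:

from databricks.sdk import WorkspaceClient
from databricks.labs.dqx.engine import DQEngine

dq_engine = DQEngine(WorkspaceClient())

# Illustrative check: rows with a null "customer_id" are flagged as errors.
checks = [
    {
        "criticality": "error",
        "check": {"function": "is_not_null", "arguments": {"column": "customer_id"}},
    }
]

# Assumes a SparkSession named `spark`, as in Databricks notebooks;
# the table name is hypothetical.
input_df = spark.table("catalog.schema.input_table")

# Returns valid records and quarantined records (warnings/errors) separately.
valid_df, quarantine_df = dq_engine.apply_checks_by_metadata_and_split(input_df, checks)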

InputConfig supports the following parameters (a short sketch follows the list):

  • location: The location of the input data source (e.g. table name or file path).
  • format: The format of the input data (default is delta).
  • is_streaming: Whether the input data is a streaming source (default is False).
  • schema: Optional schema for the input data.
  • options: Additional options for reading the input data, such as partitioning or merge settings.
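
For illustration, a minimal InputConfig for a batch read of a Delta table. The import path databricks.labs.dqx.config and the table name are assumptions to verify against your DQX version:

from databricks.labs.dqx.config import InputConfig

# Batch read of a Delta table; the location is a hypothetical table name.
input_config = InputConfig(
    location="catalog.schema.input_table",
    format="delta",      # default
    is_streaming=False,  # default; set to True for a streaming source
)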

OutputConfig supports the following parameters (an end-to-end sketch follows the list):

  • location: The location of the output data (e.g. table name).
  • format: The format of the output data (default is delta).
  • mode: The write mode for the output data (overwrite or append, default is append).
  • options: Additional options for writing the output data, such as schema merge settings.
  • trigger: Optional trigger settings for streaming output, such as trigger={"processingTime": "10 seconds"}.
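
Combining the two, here is a sketch of checking a table and writing the results to output and quarantine tables. The table names are hypothetical, and checks and dq_engine are defined as in the earlier sketch:

from databricks.labs.dqx.config import InputConfig, OutputConfig

input_config = InputConfig(location="catalog.schema.input_table")
output_config = OutputConfig(
    location="catalog.schema.valid_records",
    mode="append",                    # default write mode
    options={"mergeSchema": "true"},  # example option for schema evolution
)
quarantine_config = OutputConfig(location="catalog.schema.quarantined_records")

# Because quarantine_config is provided, valid rows are written to the output
# table and failing rows to the quarantine table.
dq_engine.apply_checks_by_metadata_and_save_in_table(
    input_config=input_config,
    checks=checks,
    output_config=output_config,
    quarantine_config=quarantine_config,
)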

The load_checks and save_checks methods support the following storage configurations (implementations of BaseChecksStorageConfig); a usage sketch follows the list:

  • FileChecksStorageConfig can be used to save or load checks from the local filesystem, with fields:
    • location: file path in the local filesystem (JSON or YAML)
  • WorkspaceFileChecksStorageConfig can be used to save or load checks from a workspace file, with fields:
    • location: workspace file path (JSON or YAML)
  • TableChecksStorageConfig can be used to save or load checks from a table, with fields:
    • location: table fully qualified name
    • run_config_name: (optional) run configuration name to load (use "default" if not provided)
    • mode: (optional) write mode for saving checks (overwrite or append, default is overwrite). The overwrite mode will only replace checks for the specific run config and not all checks in the table.
  • VolumeFileChecksStorageConfig can be used to save or load checks from a Unity Catalog Volume file, with fields:
    • location: Unity Catalog Volume file path (JSON or YAML)
  • InstallationChecksStorageConfig can be used to save or load checks from the workspace installation, with fields:
    • location: (optional) automatically set based on the checks_location field from the run configuration
    • run_config_name: (optional) run configuration name to load (use "default" if not provided)
    • product_name: name of the product/installation directory
    • assume_user: if True, assume user installation
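
As a usage sketch, the snippet below saves checks to a local YAML file and loads them back. FileChecksStorageConfig is assumed to be importable from databricks.labs.dqx.config, and the file path is illustrative; checks and dq_engine are defined as in the earlier sketch:

from databricks.labs.dqx.config import FileChecksStorageConfig

# Persist checks (defined as dictionaries) to a local YAML file, then reload them.
storage_config = FileChecksStorageConfig(location="checks.yml")
dq_engine.save_checks(checks, config=storage_config)
loaded_checks = dq_engine.load_checks(config=storage_config)

# Validate the loaded checks before applying them; the returned status is
# assumed to expose a has_errors flag (see validate_checks above).
status = dq_engine.validate_checks(loaded_checks)
assert not status.has_errors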