Transpile Guide

Verify Installation

Verify the installation by running the command below; the installation succeeded if the output matches the example screenshot:

 databricks labs lakebridge transpile --help
(Screenshot: output of transpile --help)

Execution Pre-Set Up

When you run install-transpile, you are prompted for all the settings required to transpile your code. You can configure these at installation time, or accept the defaults (or set them later) and simply pass the corresponding flags on each transpile call.

The transpile command triggers the conversion of the specified code. The available arguments are:

  • source-dialect [Required] - The dialect of the source code (e.g. snowflake, oracle, datastage).
  • input-source [Required] - The path to the SQL file or directory containing SQL files to be transpiled.
  • output-folder [Optional] - The path to the output folder where the transpiled SQL files will be stored. If not specified, the transpiled SQL files will be stored in a folder called transpiled in your current working directory.
  • error-file-path [Optional] - The path to the file where the transpile errors will be stored. If not specified, the errors will be stored in a file called errors.log in your current working directory.
  • skip-validation [Optional] - The default value is True, meaning validation is skipped. If set to False, the transpiler validates the transpiled SQL scripts against the Databricks catalog and schema provided by the user.
  • catalog-name [Optional] - The name of the catalog in Databricks. If not specified, the default catalog transpiler_test will be used.
  • schema-name [Optional] - The name of the schema in Databricks. If not specified, the default schema converter_test will be used.
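Putting these arguments together, a typical invocation that transpiles a folder of Snowflake SQL and validates the output might look like the following. The paths, catalog, and schema names below are illustrative placeholders, not values from your environment:

```shell
# Illustrative example only -- substitute your own paths and names.
databricks labs lakebridge transpile \
  --source-dialect snowflake \
  --input-source /path/to/snowflake_sql \
  --output-folder /path/to/transpiled \
  --error-file-path /path/to/errors.log \
  --skip-validation false \
  --catalog-name my_catalog \
  --schema-name my_schema
```

With --skip-validation false, the transpiled scripts are checked against the given catalog and schema rather than being written out unvalidated.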

Execution

Execute the command below to start the transpile process, passing the arguments directly in the call:

 databricks labs lakebridge transpile --transpiler-config-path <absolute-path> --input-source <absolute-path> --source-dialect <snowflake> --output-folder <absolute-path> --skip-validation <True|False> --catalog-name <catalog name> --schema-name <schema name>
(Screenshot: output of a transpile run)

If you configured all the required inputs at installation time, you can simply run:

 databricks labs lakebridge transpile
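Assuming flags passed on the command line take precedence over the stored configuration (the usual CLI convention, not confirmed here), you can keep your installed defaults and override a single setting per run; the path below is a placeholder:

```shell
# Reuse the installed configuration but point at a different input folder.
databricks labs lakebridge transpile --input-source /path/to/other_sql
```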