Transpile Guide
Verify Installation
Verify that the installation succeeded by running the command below; the installation is successful if the output matches the example screenshot provided:
databricks labs lakebridge transpile --help

Execution Pre-Set Up
When you run install-transpile, you are prompted for all the settings required to transpile your code. You can configure these at installation time, or accept the defaults (or set them later) and simply pass the corresponding flags on each transpile call.
The transpile command triggers the conversion of the specified code. These are the available arguments:
- source-dialect [Required] - Dialect name (e.g. snowflake, oracle, datastage).
- input-source [Required] - The path to the SQL file or directory containing SQL files to be transpiled.
- output-folder [Optional] - The path to the output folder where the transpiled SQL files will be stored. If not specified, the transpiled SQL files will be stored in a folder called transpiled in your current working directory.
- error-file-path [Optional] - The path to the file where the transpile errors will be stored. If not specified, the errors will be stored in a file called errors.log in your current working directory.
- skip-validation [Optional] - The default value is True. If set to False, the transpiler will validate the transpiled SQL scripts against the Databricks catalog and schema provided by the user.
- catalog-name [Optional] - The name of the catalog in Databricks. If not specified, the default catalog transpiler_test will be used.
- schema-name [Optional] - The name of the schema in Databricks. If not specified, the default schema converter_test will be used.
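As an illustration, the arguments above can be combined into a single call. The paths, catalog, and schema below are placeholder values for this sketch; substitute your own:

```shell
# Hypothetical example invocation: transpile a folder of Snowflake SQL,
# writing results and errors to explicit locations. All paths and the
# catalog/schema names are placeholders.
databricks labs lakebridge transpile \
  --source-dialect snowflake \
  --input-source /path/to/snowflake_sql \
  --output-folder /path/to/transpiled \
  --error-file-path /path/to/errors.log \
  --skip-validation true
```

Because skip-validation is left at its default of True here, no catalog or schema name is needed; pass --catalog-name and --schema-name when validation is enabled.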
Execution
Execute the command below to start the transpile process, passing the arguments directly in the call:
databricks labs lakebridge transpile --transpiler-config-path <absolute-path> --input-source <absolute-path> --source-dialect <snowflake> --output-folder <absolute-path> --skip-validation <True|False> --catalog-name <catalog name> --schema-name <schema name>

If you configured all the required inputs at installation time, you can simply run:
databricks labs lakebridge transpile