Updates a pipeline with the supplied configuration.

Usage

update_pipeline(
  client,
  pipeline_id,
  allow_duplicate_names = NULL,
  catalog = NULL,
  channel = NULL,
  clusters = NULL,
  configuration = NULL,
  continuous = NULL,
  development = NULL,
  edition = NULL,
  expected_last_modified = NULL,
  filters = NULL,
  id = NULL,
  libraries = NULL,
  name = NULL,
  notifications = NULL,
  photon = NULL,
  serverless = NULL,
  storage = NULL,
  target = NULL,
  trigger = NULL
)

pipelinesUpdate(
  client,
  pipeline_id,
  allow_duplicate_names = NULL,
  catalog = NULL,
  channel = NULL,
  clusters = NULL,
  configuration = NULL,
  continuous = NULL,
  development = NULL,
  edition = NULL,
  expected_last_modified = NULL,
  filters = NULL,
  id = NULL,
  libraries = NULL,
  name = NULL,
  notifications = NULL,
  photon = NULL,
  serverless = NULL,
  storage = NULL,
  target = NULL,
  trigger = NULL
)

Arguments

client

Required. Instance of DatabricksClient().

pipeline_id

Unique identifier for this pipeline.

allow_duplicate_names

If false, deployment will fail if the name has changed and conflicts with the name of another pipeline.

catalog

A catalog in Unity Catalog to publish data from this pipeline to.

channel

DLT Release Channel that specifies which version to use.

clusters

Cluster settings for this pipeline deployment.

configuration

String-to-string configuration map for this pipeline execution.

continuous

Whether the pipeline is continuous or triggered.

development

Whether the pipeline is in Development mode.

edition

Pipeline product edition.

expected_last_modified

If present, the last-modified time of the pipeline settings before the edit.

filters

Filters on which Pipeline packages to include in the deployed graph.

id

Unique identifier for this pipeline.

libraries

Libraries or code needed by this deployment.

name

Friendly identifier for this pipeline.

notifications

List of notification settings for this pipeline.

photon

Whether Photon is enabled for this pipeline.

serverless

Whether serverless compute is enabled for this pipeline.

storage

DBFS root directory for storing checkpoints and tables.

target

Target schema (database) to add tables in this pipeline to.

trigger

Which pipeline trigger to use.
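
Examples

A minimal sketch of a typical call, not taken from the official documentation: it assumes the package is attached as databricks, that DatabricksClient() picks up credentials from the environment (for example DATABRICKS_HOST and DATABRICKS_TOKEN), and that the pipeline id, notebook path, and cluster settings below are placeholders. Nested arguments such as libraries and clusters are passed as R lists mirroring the REST API payload.

# Hedged sketch; replace the placeholder id, notebook path, and cluster
# settings with values from your own workspace.
library(databricks)

client <- DatabricksClient()

update_pipeline(
  client,
  pipeline_id = "1234-abcd-5678-efgh",   # placeholder pipeline id
  name        = "sales_ingestion",
  development = TRUE,                    # run the pipeline in development mode
  continuous  = FALSE,                   # triggered rather than continuous
  libraries   = list(
    list(notebook = list(path = "/Repos/etl/ingest_sales"))
  ),
  clusters    = list(
    list(label = "default", num_workers = 2)
  )
)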