Repair a job run.
This is a long-running operation that blocks until the Databricks job run reaches a TERMINATED or SKIPPED state, with a default timeout of 20 minutes that you can change via the timeout parameter. By default, the state of the Databricks job run is reported to the console. You can change this behavior
via the callback parameter.
Usage
repair_job_run_and_wait(
  client,
  run_id,
  dbt_commands = NULL,
  jar_params = NULL,
  job_parameters = NULL,
  latest_repair_id = NULL,
  notebook_params = NULL,
  pipeline_params = NULL,
  python_named_params = NULL,
  python_params = NULL,
  rerun_all_failed_tasks = NULL,
  rerun_dependent_tasks = NULL,
  rerun_tasks = NULL,
  spark_submit_params = NULL,
  sql_params = NULL,
  timeout = 20,
  callback = cli_reporter
)

Arguments
- client
- Required. Instance of DatabricksClient() 
- run_id
- Required. The job run ID of the run to repair. 
- dbt_commands
- An array of commands to execute for jobs with the dbt task, for example 'dbt_commands': ['dbt deps', 'dbt seed', 'dbt run'].
- jar_params
- A list of parameters for jobs with Spark JAR tasks, for example 'jar_params': ['john doe', '35'].
- job_parameters
- Job-level parameters used in the run. 
- latest_repair_id
- The ID of the latest repair. 
- notebook_params
- A map from keys to values for jobs with a notebook task, for example 'notebook_params': {'name': 'john doe', 'age': '35'}.
- pipeline_params
- This field has no description yet. 
- python_named_params
- A map from keys to values for jobs with a Python wheel task, for example 'python_named_params': {'name': 'task', 'data': 'dbfs:/path/to/data.json'}.
- python_params
- A list of parameters for jobs with Python tasks, for example 'python_params': ['john doe', '35'].
- rerun_all_failed_tasks
- If true, repair all failed tasks. 
- rerun_dependent_tasks
- If true, repair all tasks that depend on the tasks in rerun_tasks, even if they were previously successful.
- rerun_tasks
- The task keys of the task runs to repair. 
- spark_submit_params
- A list of parameters for jobs with a spark submit task, for example 'spark_submit_params': ['--class', 'org.apache.spark.examples.SparkPi'].
- sql_params
- A map from keys to values for jobs with a SQL task, for example 'sql_params': {'name': 'john doe', 'age': '35'}.
- timeout
- Time, in minutes, to wait for the operation to complete before failing.
- callback
- Function used to report the status of the operation. By default, status is reported to the console.
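A minimal usage sketch, assuming a configured Databricks workspace; the run ID below is a placeholder for the ID of an existing failed job run:

```r
library(databricks)

# Authenticate against the workspace; credentials are typically picked up
# from the DATABRICKS_HOST / DATABRICKS_TOKEN environment variables or a
# configuration profile.
client <- DatabricksClient()

# Re-run all failed tasks of an existing run (placeholder run ID), along
# with any tasks that depend on them, blocking until the run reaches a
# TERMINATED or SKIPPED state or the 30-minute timeout elapses.
run <- repair_job_run_and_wait(
  client,
  run_id = 123456,
  rerun_all_failed_tasks = TRUE,
  rerun_dependent_tasks = TRUE,
  timeout = 30
)
```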