Overwrite All Settings For A Job
Arguments
- job_id
The canonical identifier of the job.
- name
Name for the job.
- schedule
Instance of cron_schedule().
- tasks
Task specifications to be executed by this job. Use job_tasks().
- job_clusters
Named list of job cluster specifications (using new_cluster()) that can be shared and reused by tasks of this job. Libraries cannot be declared in a shared job cluster. You must declare dependent libraries in task settings.
- email_notifications
Instance of email_notifications().
- timeout_seconds
An optional timeout applied to each run of this job. The default behavior is to have no timeout.
- max_concurrent_runs
Maximum allowed number of concurrent runs of the job. Set this value if you want to be able to execute multiple runs of the same job concurrently. This setting affects only new runs. This value cannot exceed 1000. Setting this value to 0 causes all new runs to be skipped. The default behavior is to allow only 1 concurrent run.
- access_control_list
Instance of access_control_request().
- git_source
Optional specification for a remote repository containing the notebooks used by this job's notebook tasks. Instance of git_source().
- host
Databricks workspace URL, defaults to calling db_host().
- token
Databricks workspace token, defaults to calling db_token().
- perform_request
If TRUE (default) the request is performed; if FALSE the httr2 request is returned without being performed.
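A minimal sketch of a call, assuming this page documents brickster's db_jobs_reset() (the one Jobs API function absent from the "See also" list below) and that job_task() and notebook_task() are the task-building helpers alongside job_tasks() and new_cluster(); the job id, notebook path, and cluster values are hypothetical placeholders.

```r
library(brickster)

# Overwrite all settings for an existing job; because reset replaces the
# whole job specification, unspecified settings revert to their defaults.
req <- db_jobs_reset(
  job_id = 123,  # hypothetical job id
  name = "nightly-etl",
  tasks = job_tasks(
    job_task(
      task_key = "main",
      new_cluster = new_cluster(
        spark_version = "13.3.x-scala2.12",  # example values only
        node_type_id = "i3.xlarge",
        num_workers = 2
      ),
      task = notebook_task(notebook_path = "/Workspace/example/etl")
    )
  ),
  max_concurrent_runs = 1,
  perform_request = FALSE  # return the httr2 request without sending it
)
```

With perform_request = FALSE the function returns the prepared httr2 request, which is useful for inspecting the payload before sending it against a live workspace.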
See also
Other Jobs API: db_jobs_create(), db_jobs_delete(), db_jobs_get(), db_jobs_list(), db_jobs_run_now(), db_jobs_runs_cancel(), db_jobs_runs_delete(), db_jobs_runs_export(), db_jobs_runs_get(), db_jobs_runs_get_output(), db_jobs_runs_list(), db_jobs_runs_submit(), db_jobs_update()