
...

When enabled, users can override Spark configuration options for output objects before running Spark jobs.

Tip: When enabled, a default set of Spark configuration options is available to users. Additional properties can be specified through the Spark Whitelist Properties setting.

See Enable Spark Job Overrides.
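For illustration, the following sketch shows how per-output Spark overrides might be checked against a whitelist before a job runs. The property names, defaults, and whitelist contents here are assumptions for the example, not values shipped with the product.

```python
# Hypothetical illustration: merging user-supplied Spark overrides for an
# output object with a baseline config, allowing only whitelisted properties.
# Property names and values are examples only, not product defaults.

# Properties the administrator has whitelisted (Spark Whitelist Properties).
SPARK_WHITELIST = {
    "spark.driver.memory",
    "spark.executor.memory",
    "spark.executor.cores",
}

# Baseline configuration applied to every job.
DEFAULTS = {
    "spark.driver.memory": "2g",
    "spark.executor.memory": "6g",
}

def resolve_spark_conf(user_overrides: dict) -> dict:
    """Return the effective Spark conf: defaults plus whitelisted overrides."""
    rejected = set(user_overrides) - SPARK_WHITELIST
    if rejected:
        raise ValueError(f"Properties not whitelisted: {sorted(rejected)}")
    return {**DEFAULTS, **user_overrides}

# A user raises executor memory for one large output before running the job.
print(resolve_spark_conf({"spark.executor.memory": "16g"}))
```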

Databricks Cluster Policies

When enabled, this feature allows the platform to leverage cluster policies that you have created when it creates new Databricks clusters for job execution.

NOTE: You must create cluster policies before enabling this feature. Each user can select a cluster policy to apply. Additional configuration may be required. If no policy is selected, jobs can still be executed.
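As a sketch of the setup step, the example below creates a simple cluster policy through the Databricks Cluster Policies REST API (POST /api/2.0/policies/clusters/create). The workspace URL, access token, and policy rules are placeholders, not recommended settings.

```python
# Sketch: creating a Databricks cluster policy via the REST API before
# enabling this feature. The host, token, and policy rules are placeholders.
import json
import requests

DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "dapi-EXAMPLE"  # placeholder personal access token

# Policy rules constrain the clusters that jobs may create, for example
# pinning the Spark version and capping autoscaling.
policy_definition = {
    "spark_version": {"type": "fixed", "value": "11.3.x-scala2.12"},
    "autoscale.max_workers": {"type": "range", "maxValue": 8},
    "node_type_id": {"type": "allowlist", "values": ["Standard_DS3_v2"]},
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/policies/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "name": "job-execution-policy",
        # The API expects the rule set as a JSON-encoded string.
        "definition": json.dumps(policy_definition),
    },
)
resp.raise_for_status()
print("Created policy:", resp.json()["policy_id"])
```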

For more information:

Databricks Job Management

Enables job execution on Databricks through a secondary method. When enabled, Databricks jobs are executed via the runs/submit API endpoint, which avoids the job quota limit that Databricks imposes on the workspace. This flag also enables deletion of Databricks jobs from the Databricks workspace.
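As a sketch of the underlying call, the example below submits a one-time run through the Databricks runs/submit endpoint, which creates a run without registering a persistent job definition. The workspace URL, access token, notebook path, and cluster settings are placeholders.

```python
# Sketch: submitting a one-time run via runs/submit. Because no persistent
# job definition is created, the run does not count against the jobs quota.
# Host, token, and task details are placeholders.
import requests

DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "dapi-EXAMPLE"  # placeholder personal access token

payload = {
    "run_name": "wrangle-output-run",  # placeholder name
    "tasks": [
        {
            "task_key": "transform",
            "notebook_task": {"notebook_path": "/Jobs/example"},  # placeholder
            "new_cluster": {
                "spark_version": "11.3.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 2,
            },
        }
    ],
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/runs/submit",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
print("Submitted run:", resp.json()["run_id"])  # run_id identifies the one-time run
```

Deleting a job definition from the workspace, by contrast, goes through the separate POST /api/2.1/jobs/delete endpoint.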

...