
NOTE: If the job is executed in an environment other than D s photon, the job is queued for execution in that environment. Jobs executed on a remote cluster may incur additional overhead to spin up execution nodes, typically within 10-15 seconds. During job execution, D s product observes the job in progress and reports progress back into the application. D s product does not control the execution of the job.


Tip: Jobs can be scheduled for periodic execution through the Flow View page. For more information, see Add Schedule Dialog.


Tip: Columns that have been hidden in the Transformer page still appear in the generated output. Before you run a job, verify that all currently hidden columns are acceptable to include in the output.

D caption
Run Job Page


Spark: Executes the job using the Spark running environment.

Dataflow: Executes the job on D s dataflow within the D s gcp platform. This environment is best suited for larger jobs.



Tip: If this limit is exceeded, the job may fail with a "job graph too large" error. The workaround is to split the job into smaller jobs, such as splitting the recipe into multiple recipes. This is a known limitation of D s dataflow.


Run jobs via API

You can use the available REST APIs to execute jobs for known datasets. For more information, see API Reference.
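As a rough sketch, triggering a job over REST typically means POSTing the dataset identifier and the desired running environment to a job-execution endpoint with an access token. The base URL, endpoint path, field names, and token below are illustrative assumptions, not the documented contract; consult the API Reference for the actual endpoint and payload schema.

```python
import json

# Hypothetical values -- replace with your deployment's host, token, and dataset id.
BASE_URL = "https://example.com/v4"   # assumed API base path
API_TOKEN = "your-access-token"       # assumed bearer token

def build_run_job_request(dataset_id, running_environment="photon"):
    """Assemble the (assumed) URL, headers, and JSON body for launching a job.

    The request is built but not sent, so the pieces can be inspected or
    handed to any HTTP client.
    """
    url = f"{BASE_URL}/jobGroups"  # assumed job-execution endpoint
    headers = {
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    }
    payload = {
        "wrangledDataset": {"id": dataset_id},           # assumed field name
        "runParameters": {"execution": running_environment},  # assumed field name
    }
    return url, headers, json.dumps(payload)

url, headers, body = build_run_job_request(28629, running_environment="spark")
# Sending the request would then be a single call with your HTTP client, e.g.:
#   requests.post(url, headers=headers, data=body)
print(url)
print(body)
```

Separating request construction from sending keeps the environment override (`photon`, `spark`, or a cloud running environment) in one place, so the same helper can launch the job in different environments.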

D s also
label((label = "job_ui") OR (label = "job"))