...
NOTE: If the job is executed in an environment other than , the job is queued for execution in that environment. Jobs executed on a remote cluster may incur additional overhead to spin up execution nodes, which typically takes 10-15 seconds. During job execution, observes the job in progress and reports progress back into the application as needed. does not control the execution of the job.
Tip: Jobs can be scheduled for periodic execution through the Flow View page. For more information, see Add Schedule Dialog.
Tip: Columns that have been hidden in the Transformer page still appear in the generated output. Before you run a job, verify that all currently hidden columns are acceptable to include in the output.

...
Spark: Executes the job using the Spark running environment.
Dataflow: Executes the job on within the . This environment is best suited for larger jobs.
...
Tip: If this limit is exceeded, the job may fail with a "job graph too large" error. The workaround is to split the job into smaller jobs, for example by splitting the recipe into multiple recipes. This is a known limitation of .
...
Automation
Run jobs via API
You can use the available REST APIs to execute jobs for known datasets. For more information, see API Reference.
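As a rough illustration, a job for a known dataset can be queued with a single authenticated POST request. The base URL, endpoint path, payload fields, and token below are assumptions for the sketch; consult the API Reference for the actual contract.

```python
# Minimal sketch of queuing a job run via the REST API.
# The endpoint path (/jobGroups), payload shape, and bearer-token auth
# are assumptions for illustration -- verify them against the API Reference.
import json
from urllib import request

API_BASE = "https://example.com/v4"  # hypothetical base URL
TOKEN = "YOUR_ACCESS_TOKEN"          # hypothetical access token

def build_job_request(wrangled_dataset_id: int) -> request.Request:
    """Build a POST request that queues a job for a known dataset."""
    payload = {"wrangledDataset": {"id": wrangled_dataset_id}}
    return request.Request(
        url=f"{API_BASE}/jobGroups",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_job_request(28629)  # 28629 is a placeholder dataset ID
# To execute (requires network access and a valid token):
# with request.urlopen(req) as resp:
#     job_group = json.load(resp)   # response includes the queued job's ID
```

The request only queues the job; you would poll the returned job-group ID to track completion, per the API Reference.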