
Our documentation site is moving!

For up-to-date documentation of Designer Cloud on AWS, please visit us at https://help.alteryx.com/AWS/.

   

When you run a job from the Run Job page, you can pass a set of Spark property values to the Spark running environment to apply to the execution of that job. These property values override the global Spark settings for your deployment.



Spark overrides are applied to individual output objects.

  • You can specify overrides for ad-hoc jobs through the Run Job page.
  • You can specify overrides when you configure a scheduled job execution.

To specify overrides in the Run Job page, click the Advanced Execution Settings caret.
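As an illustration, overrides are entered as Spark property key-value pairs. The properties below are standard Spark configuration settings chosen as examples; which properties your deployment permits you to override may vary, so confirm against your environment's allowed list:

```
# Example Spark property overrides (standard Spark settings;
# availability depends on your deployment's configuration):
spark.executor.memory=6g
spark.executor.cores=2
spark.driver.memory=4g
spark.sql.shuffle.partitions=200
```

Values set here apply only to the execution of this job's output object and take precedence over the deployment-wide defaults described above.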

