Dataflow Execution Settings
NOTE: Changes to execution settings at the project level do not affect any overrides previously applied at the individual job level. Job-level overrides remain as configured.
Tip: For more information on how the following settings affect your jobs, see Dataflow Execution Settings.
A region is a specific geographical location where you can run your resources.
A zone is a subdivision of a region that contains specific resources.
Choose the type of machine on which to run your job. The default is
Making changes to Region, Zone, or Machine Type can affect the time and cost of job executions. For more information, see https://cloud.google.com/dataflow/docs/concepts/regional-endpoints.
For more information on machine types, see https://cloud.google.com/compute/docs/machine-types.
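The region/zone relationship described above follows the usual Compute Engine naming convention, which can be sketched as follows (the names used are ordinary Compute Engine examples, not defaults of this product):

```python
def zone_in_region(zone: str, region: str) -> bool:
    """A Compute Engine zone name is its region name plus a single-letter
    suffix; for example, zone 'us-central1-a' belongs to region 'us-central1'."""
    return zone.rsplit("-", 1)[0] == region

print(zone_in_region("us-central1-a", "us-central1"))   # True
print(zone_in_region("europe-west1-b", "us-central1"))  # False
```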
NOTE: Changes to advanced settings at the project level do not affect any overrides previously applied at the individual job level. Job-level overrides remain as configured.
VPC network mode
Select the network mode to use.
If the network mode is set to
NOTE: Unless you have specific reasons to modify these settings, you should leave them as the default values. These network settings apply to job execution. Preview and sampling use the
For more information:
To use a different VPC network, enter the name of the VPC network to use as an override for this job. Click Save to apply the override.
To specify a different subnetwork, enter the URL of the subnetwork. The URL should be in the following format:
If you have access to another project within your organization, you can execute your Dataflow job through it by specifying a full URL in the following form:
Click Save to apply the override.
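As a sketch of the URL shapes involved, assuming the two subnetwork URL forms commonly documented for Dataflow (a region-scoped short form and a full Compute Engine URL; confirm the exact accepted forms in the linked documentation):

```python
import re

# Assumed forms, based on general Dataflow documentation rather than this
# product's own reference:
#   regions/REGION/subnetworks/SUBNETWORK
#   https://www.googleapis.com/compute/v1/projects/PROJECT/regions/REGION/subnetworks/SUBNETWORK
SHORT_FORM = re.compile(r"^regions/[\w-]+/subnetworks/[\w-]+$")
FULL_FORM = re.compile(
    r"^https://www\.googleapis\.com/compute/v1/"
    r"projects/[\w-]+/regions/[\w-]+/subnetworks/[\w-]+$"
)

def looks_like_subnetwork_url(url: str) -> bool:
    """Surface-level format check only; does not verify the subnetwork exists."""
    return bool(SHORT_FORM.match(url) or FULL_FORM.match(url))
```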
Worker IP address configuration
If the VPC Network mode is set to
The type of algorithm to use to scale the number of Google Compute Engine instances to accommodate the size of your job. Possible values:
Initial number of workers
Number of Google Compute Engine instances with which to launch the job. This number may be adjusted as part of job execution. This number must be an integer between 1 and
Maximum number of workers
Maximum number of Google Compute Engine instances to use during execution. This number must be an integer between 1 and
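The worker-count constraints above can be sketched as a simple validation. Because the documentation's upper bound is not reproduced here, it is taken as a parameter rather than guessed:

```python
def validate_worker_counts(initial: int, maximum: int, ceiling: int) -> None:
    """Check the worker-count settings described above.

    `ceiling` stands in for the documented upper bound (not reproduced here),
    so it is passed in rather than hard-coded.
    """
    if not 1 <= initial <= ceiling:
        raise ValueError(f"initial workers must be between 1 and {ceiling}")
    if not 1 <= maximum <= ceiling:
        raise ValueError(f"maximum workers must be between 1 and {ceiling}")
    if initial > maximum:
        raise ValueError("initial workers cannot exceed maximum workers")
```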
Every Dataprep by Trifacta job executed in Dataflow must be submitted through a service account. By default, Dataprep by Trifacta uses a single Compute Engine service account under which jobs from all project users are run.
Optionally, you can specify a different service account under which to run your jobs for the project.
NOTE: When using a named service account to access data and run jobs in other projects, you must be granted the
NOTE: Individual users can specify service accounts under which their jobs are run. If companion service accounts are enabled, each user must have a service account specified for use.
For more information on service accounts, see Google Service Account Management.
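User-managed service account emails follow a standard form (NAME@PROJECT_ID.iam.gserviceaccount.com). A hypothetical helper that checks only that form, not whether the account exists or holds the required roles:

```python
import re

# Assumed pattern based on the standard naming rules for user-managed
# service accounts: account IDs and project IDs are 6-30 lowercase
# letters, digits, or hyphens, starting with a letter.
SA_EMAIL = re.compile(
    r"^[a-z][a-z0-9-]{5,29}"              # service account ID
    r"@[a-z][a-z0-9-]{4,28}[a-z0-9]"      # project ID
    r"\.iam\.gserviceaccount\.com$"
)

def is_service_account_email(email: str) -> bool:
    return bool(SA_EMAIL.match(email))
```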
Create or assign labels to apply to billing for the Dataprep by Trifacta jobs run in your project. You can reference up to 64 labels.
NOTE: Each label must have a unique key name.
For more information, see https://cloud.google.com/resource-manager/docs/creating-managing-labels.
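The label constraints above (unique key names, at most 64 labels) can be sketched as:

```python
def validate_label_pairs(pairs: list[tuple[str, str]]) -> dict[str, str]:
    """Check label key/value pairs against the constraints stated above:
    each key must be unique, and at most 64 labels may be referenced."""
    seen: dict[str, str] = {}
    for key, value in pairs:
        if key in seen:
            raise ValueError(f"duplicate label key: {key!r}")
        seen[key] = value
    if len(seen) > 64:
        raise ValueError("a job may reference at most 64 labels")
    return seen
```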
Notes on behavior:
- Values specified here are applied to all jobs executed within the project.
- If property values are not specified here, then the properties are not passed in with any job execution, and the default Dataprep by Trifacta property values are used.
- The property values specified here can be overridden by property values specified for individual jobs. For more information, see Dataflow Execution Settings.