Dataflow Execution Settings
NOTE: Changes to execution settings at the project level do not affect overrides previously applied at the individual job level. Job-level overrides remain as configured.
Tip: For more information on how the following settings affect your jobs, see Dataflow Execution Settings.
Setting | Description |
---|---|
Regional endpoint | A region is a specific geographical location where you can run your resources. |
Zone | A sub-section of a region, a zone contains specific resources. Select the zone in which to run your job. |
Machine type | Choose the type of machine on which to run your job. The default is `n1-standard-1`. |
Making changes to the region, zone, or machine type can affect the time and cost of job executions. For more information, see https://cloud.google.com/dataflow/docs/concepts/regional-endpoints.
For more information on machine types, see https://cloud.google.com/compute/docs/machine-types.
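
These settings are applied to the Dataflow jobs that Dataprep by Trifacta generates. For orientation only, the following is a minimal sketch of how the same three settings look as Apache Beam (Python SDK) pipeline options when a job is submitted to Dataflow directly; the project ID, bucket path, region, and zone shown are placeholder assumptions, not values from this page.

```python
# Minimal sketch (not Dataprep's own submission code): the equivalent
# Dataflow pipeline options in the Apache Beam Python SDK. Project,
# bucket, region, and zone values are placeholder assumptions.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project-id",              # placeholder project
    region="us-central1",                 # "Regional endpoint" setting
    worker_zone="us-central1-a",          # "Zone" setting (optional; often left unset)
    machine_type="n1-standard-1",         # "Machine type" setting (documented default)
    temp_location="gs://my-bucket/tmp",   # placeholder staging bucket
)

# A trivial pipeline, only to show where the options are applied.
with beam.Pipeline(options=options) as pipeline:
    _ = pipeline | beam.Create(["hello"]) | beam.Map(print)
```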
Advanced Settings
NOTE: Changes to advanced settings at the project level do not affect overrides previously applied at the individual job level. Job-level overrides remain as configured.
Setting | Description |
---|---|
VPC network mode | Select the network mode to use. If the network mode is set to Custom, specify the network or subnetwork to use in the settings below. NOTE: Unless you have specific reasons to modify these settings, you should leave them at their default values. These network settings apply to job execution only; preview and sampling use the default network settings. |
Network | To use a different VPC network, enter the name of the VPC network to use as an override. Click Save to apply the override. |
Subnetwork | To specify a different subnetwork, enter the URL of the subnetwork in the following format: regions/<REGION>/subnetworks/<SUBNETWORK>, where <REGION> is the region that hosts the subnetwork and <SUBNETWORK> is the name of the subnetwork. If you have access to another project within your organization, you can execute your Dataflow job through it by specifying a full URL in the following form: https://www.googleapis.com/compute/v1/projects/<HOST_PROJECT_ID>/regions/<REGION>/subnetworks/<SUBNETWORK>, where <HOST_PROJECT_ID> is the ID of the host project, and <REGION> and <SUBNETWORK> identify the subnetwork within that project. Click Save to apply the override. |
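
For reference, the Network and Subnetwork overrides above correspond to the standard network and subnetwork Dataflow pipeline options. The sketch below, again using the Apache Beam Python SDK, shows both the short same-project subnetwork form and the full cross-project URL form; all concrete names are placeholder assumptions.

```python
# Sketch of the Network / Subnetwork overrides as Beam pipeline options.
# The VPC, subnetwork, project, and bucket names are placeholder assumptions.
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project-id",                     # placeholder
    region="us-central1",                        # placeholder
    temp_location="gs://my-bucket/tmp",          # placeholder
    network="my-custom-vpc",                     # "Network" override
    # Same-project form: regions/<REGION>/subnetworks/<SUBNETWORK>
    subnetwork="regions/us-central1/subnetworks/my-subnet",
    # Cross-project (host project) form would instead be the full URL:
    # subnetwork=(
    #     "https://www.googleapis.com/compute/v1/projects/"
    #     "<HOST_PROJECT_ID>/regions/us-central1/subnetworks/my-subnet"
    # ),
)
```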
Setting | Description |
---|---|
Worker IP address configuration | If the VPC network mode is set to Custom, specify whether worker instances use public (external) IP addresses or internal IP addresses only. |
Autoscaling algorithms | The type of algorithm to use to scale the number of Google Compute Engine instances to accommodate the size of your job. Possible values: Throughput based, which scales the number of workers based on throughput, and None, which disables autoscaling and runs the job with a fixed number of workers. |
Initial number of workers | Number of Google Compute Engine instances with which to launch the job. This number may be adjusted as part of job execution. This number must be an integer between 1 and 1000, inclusive. |
Maximum number of workers | Maximum number of Google Compute Engine instances to use during execution. This number must be an integer between 1 and 1000, inclusive. |
Service account | Every Dataprep by Trifacta job executed in Dataflow requires that the job be submitted through a service account. By default, Dataprep by Trifacta uses a single Compute Engine service account under which jobs from all project users are run. Optionally, you can specify a different service account under which to run your jobs for the project. NOTE: When using a named service account to access data and run jobs in other projects, you must be granted the Service Account User role (roles/iam.serviceAccountUser) on that account. NOTE: Individual users can specify service accounts under which their jobs are run. If companion service accounts are enabled, each user must have a service account specified for use. For more information on service accounts, see Google Service Account Management. |
Labels | Create or assign labels to apply to the billing for the Dataprep by Trifacta jobs run in your project. You may reference up to 64 labels. NOTE: Each label must have a unique key name. For more information, see https://cloud.google.com/resource-manager/docs/creating-managing-labels. |
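
Taken together, the worker-level settings in this table also map onto standard Dataflow pipeline options. The sketch below, once more using the Apache Beam Python SDK with placeholder values, shows one plausible combination; it illustrates the option names only and is not Dataprep's own submission code.

```python
# Sketch of the worker-level settings as Beam pipeline options.
# All concrete values below are placeholder assumptions.
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project-id",                     # placeholder
    region="us-central1",                        # placeholder
    temp_location="gs://my-bucket/tmp",          # placeholder
    use_public_ips=False,                        # "Worker IP address configuration"
    autoscaling_algorithm="THROUGHPUT_BASED",    # or "NONE" to disable autoscaling
    num_workers=2,                               # "Initial number of workers"
    max_num_workers=10,                          # "Maximum number of workers"
    service_account_email=(                      # "Service account" (placeholder)
        "runner@my-project-id.iam.gserviceaccount.com"
    ),
    labels=["cost-center=analytics"],            # "Labels" as KEY=VALUE pairs
)
```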
Notes on behavior:
- Values specified here are applied to all jobs executed within the project.
- If property values are not specified here, then the properties are not passed in with any job execution, and the default Dataprep by Trifacta property values are used.
- The property values specified here can be overridden by property values specified for individual jobs. For more information, see Dataflow Execution Settings.