Identity and Access Management (IAM) allows you to control user and group access to your project's resources. This section describes the IAM permissions relevant to Dataprep and the IAM roles that grant those permissions. To access the IAM console, see https://cloud.google.com/iam.
- A role is a set of one or more permissions. A role is assigned to users and groups.
- A permission grants access to a resource. Different permissions can grant different access levels to the same resource.
For more information on the service accounts that Dataprep uses to manage security and permissions while running jobs, see https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#security_and_permissions_for_pipelines_on_google_cloud_platform.
Tools for managing IAM policies:
- Google Cloud Console
- gcloud CLI
For more information, see https://cloud.google.com/iam/docs/granting-changing-revoking-access.
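For example, a project owner can grant the Dataprep user role described below with the gcloud CLI. This is a minimal sketch; the project ID and user email are placeholder values:

```shell
# Grant the Dataprep user role to a user (example values are placeholders).
gcloud projects add-iam-policy-binding example-project \
  --member="user:alice@example.com" \
  --role="roles/dataprep.user"
```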
Required Roles and Their Permissions
To use Dataprep, the following roles are required. Below, you can review each required role, its purpose, and the permissions that it enables.
| Role | Use | Permissions and roles |
| --- | --- | --- |
| roles/dataprep.user | Enables a user to run Dataprep in a project. | See below. |
| roles/dataprep.serviceAgent | Enables the platform to access and modify datasets and storage and to run and manage jobs on behalf of the user within the project. | NOTE: When the product is enabled within a project, this role is granted by the project owner as part of the enablement process. For more information, see Enable or Disable Dataprep. |
roles/dataprep.user IAM Role
All users of any version of Dataprep must be assigned the roles/dataprep.user IAM role.
This role and its related permissions enable access to all data in a project. Other permissions do not apply.
Required Permissions for Dataprep
The following base set of IAM permissions, plus some additional permissions, is required for accessing the product. Below, you can review the required permissions for this product edition.
NOTE: These permissions provide basic access to the product. Additional features within the product or available through external integrations are considered optional.
These permissions enable the product to:
- List available machine types for jobs
- Read and write to Cloud Storage, the base storage layer for Dataprep:
| Permission | Use | Scope |
| --- | --- | --- |
| storage.buckets.list | List buckets | Required at project level |
| storage.buckets.get | Get bucket metadata | Required for staging bucket only |
| storage.objects.create | Create files | Required for staging bucket only |
| storage.objects.delete | Delete files | Required for staging bucket only |
| storage.objects.get | Read files | Required for staging bucket only |
| storage.objects.list | List files | Required for staging bucket only |
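If you prefer not to grant a broader predefined storage role, the staging-bucket permissions above can be bundled into a custom role with the gcloud CLI. This is a sketch; the role ID, title, and project ID are placeholder values:

```shell
# Create a custom role containing the Cloud Storage permissions listed above.
gcloud iam roles create dataprepStorageAccess \
  --project=example-project \
  --title="Dataprep Storage Access" \
  --permissions=storage.buckets.list,storage.buckets.get,storage.objects.create,storage.objects.delete,storage.objects.get,storage.objects.list
```

The custom role can then be granted on the project or, for the staging-bucket-only permissions, on the bucket itself.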
Permissions required to integrate with external services are considered optional.
Read and write to BigQuery, including views and custom SQL. These permissions support custom SQL queries and launching jobs with BigQuery data sources:
| Permission | Use | Scope |
| --- | --- | --- |
| bigquery.jobs.create | Run BigQuery jobs | Required at project level to use BigQuery |
| bigquery.datasets.get | List and get metadata about datasets in the project | Can be applied at project level or at individual dataset level |
| bigquery.tables.create | Create tables in a dataset; required to execute custom SQL queries | Can be applied at project level or at individual dataset level |
| bigquery.tables.get | Get table metadata | Can be applied at project level or at individual dataset level |
| bigquery.tables.getData | Get table contents | Can be applied at project level or at individual dataset level |
| bigquery.tables.list | List tables in a dataset | Can be applied at project level or at individual dataset level |
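To review which roles (and therefore which of these permissions) a user already holds in a project, you can inspect the project's IAM policy. The project ID and member email below are placeholder values:

```shell
# List the roles bound to a specific member in the project.
gcloud projects get-iam-policy example-project \
  --flatten="bindings[].members" \
  --filter="bindings.members:user:alice@example.com" \
  --format="table(bindings.role)"
```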
Additional permissions may be required to use specific features. Individual users may be required to grant access when a feature is first used.
| Permission | Use |
| --- | --- |
| dataflow.jobs.cancel | Enables users to cancel their in-progress jobs. This permission is not required for the product to work but may be helpful to add via IAM roles. |
BigQuery publishing options
The following permissions are required to publish to BigQuery:
| Permission | Use |
| --- | --- |
| bigquery.datasets.create | Create datasets in BigQuery |
| bigquery.datasets.update | Update datasets in BigQuery |
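As a sketch, these publishing permissions could also be bundled into a custom role and granted alongside roles/dataprep.user. The role ID, title, and project ID below are placeholder values:

```shell
# Custom role containing the BigQuery publishing permissions.
gcloud iam roles create dataprepBigQueryPublish \
  --project=example-project \
  --title="Dataprep BigQuery Publishing" \
  --permissions=bigquery.datasets.create,bigquery.datasets.update
```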
The bigquery.tables.delete permission is not required to publish to BigQuery. If this permission is not granted to a user at the dataset level, that user requires one of the following to drop or truncate table data in BigQuery:
- The user is granted the owner role on the project.
- The user is granted bigquery.tables.delete for the project.
NOTE: If a user does not have this permission when publishing to a table, the user receives a warning that the target dataset is read-only.
Google Sheets access
For more information, see Import Google Sheets Data.
Additional Permissions for Cloud IAM
To run jobs on Dataflow, one of the following must apply:
- The user has the iam.serviceAccounts.actAs permission on a compute service account, which must be specified during job execution.
- The user has the iam.serviceAccounts.actAs permission at the project level or on the default compute service account.
Project owners require no additional permissions on the projects that they own.
For more information, see https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#security_and_permissions_for_pipelines_on_google_cloud_platform.
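For example, granting a user the roles/iam.serviceAccountUser role on a compute service account provides the iam.serviceAccounts.actAs permission on that account. The service account name and user email below are placeholder values:

```shell
# Grant actAs on a specific compute service account via roles/iam.serviceAccountUser.
gcloud iam service-accounts add-iam-policy-binding \
  example-compute@example-project.iam.gserviceaccount.com \
  --member="user:alice@example.com" \
  --role="roles/iam.serviceAccountUser"
```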
In addition to the IAM roles above, users must also be granted permissions that enable data access based on their Cloud IAM identity. These permissions ensure that users can access the appropriate data within the project.