In the Google Cloud Platform, Identity and Access Management (IAM) allows you to control user and group access to your project's resources. This section describes the IAM permissions relevant to Cloud Dataprep and the IAM roles that grant those permissions. To access the IAM console, see https://cloud.google.com/iam.
- A role is a set of one or more permissions. A role is assigned to users and groups.
- A permission grants access to a resource. Different permissions can grant different access levels to the same resource.
...
Required Roles and Their Permissions
To use Cloud Dataprep, the following roles are required. Below, you can review each required role, its purpose, and the permissions that it enables.
| Role | Use | Permissions and roles |
|---|---|---|
| roles/dataprep.user | Enables a user to run Cloud Dataprep in a project. | See below. |
| roles/dataprep.serviceAgent | Enables the platform to access and modify datasets and storage and to run and manage jobs on behalf of the user within the project. NOTE: When the product is enabled within a project, this role is granted by the project owner as part of the enablement process. For more information, see Enable or Disable Dataprep. | Permissions: storage.buckets.get, storage.buckets.list. Roles: roles/dataflow.developer, roles/bigquery.user, roles/bigquery.dataEditor, roles/storage.objectAdmin, roles/iam.serviceAccountUser |
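These role bindings are normally created through the IAM console or as part of enablement. If you want to confirm that both roles are already bound on a project, the following is a minimal sketch using the Cloud Resource Manager v1 API; it is not part of the documented setup procedure, and the project ID is a placeholder.

```python
# Sketch: check that the two Dataprep roles from the table above are bound
# on the project. PROJECT_ID is a placeholder; adjust for your project.
from google.auth import default
from googleapiclient import discovery

PROJECT_ID = "my-project-id"  # placeholder
REQUIRED_ROLES = {"roles/dataprep.user", "roles/dataprep.serviceAgent"}

credentials, _ = default()
crm = discovery.build("cloudresourcemanager", "v1", credentials=credentials)

# Read the project's IAM policy and collect the roles that have bindings.
policy = crm.projects().getIamPolicy(resource=PROJECT_ID, body={}).execute()
granted = {b["role"] for b in policy.get("bindings", [])}

for role in sorted(REQUIRED_ROLES):
    print(f"{role}: {'present' if role in granted else 'MISSING'}")
```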
roles/dataprep.user IAM Role
All users of any version of Cloud Dataprep must be assigned the roles/dataprep.user IAM role.
...
This role and its related permissions enable access to all data in a project. Other permissions do not apply.
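Granting the role is typically done in the IAM console. As a rough illustration only (not the documented procedure), the sketch below applies the standard read-modify-write pattern with the Cloud Resource Manager v1 API to bind roles/dataprep.user to a user; the project ID and member email are placeholders.

```python
# Sketch: grant roles/dataprep.user to a user via the Resource Manager API.
# PROJECT_ID and MEMBER are placeholders; adjust for your project and user.
from google.auth import default
from googleapiclient import discovery

PROJECT_ID = "my-project-id"            # placeholder project
MEMBER = "user:analyst@example.com"     # placeholder user to receive the role

credentials, _ = default()
crm = discovery.build("cloudresourcemanager", "v1", credentials=credentials)

# Read the current IAM policy for the project.
policy = crm.projects().getIamPolicy(resource=PROJECT_ID, body={}).execute()

# Add the member to the roles/dataprep.user binding, creating it if needed.
bindings = policy.setdefault("bindings", [])
binding = next((b for b in bindings if b["role"] == "roles/dataprep.user"), None)
if binding is None:
    bindings.append({"role": "roles/dataprep.user", "members": [MEMBER]})
elif MEMBER not in binding["members"]:
    binding["members"].append(MEMBER)

# Write the modified policy back to the project.
crm.projects().setIamPolicy(resource=PROJECT_ID, body={"policy": policy}).execute()
```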
Permissions for Cloud Dataprep
This product edition provides additional capabilities for project users. The base set of permissions and some additional permissions are required. Below, you can review the required permissions for this product edition; a sketch that checks these permissions on a project follows the lists below.
...
- bigquery.datasets.get
- bigquery.jobs.create
- bigquery.tables.create
- bigquery.tables.get
- bigquery.tables.getData
- bigquery.tables.list
Run Cloud Dataprep on Cloud Dataflow:
- compute.machineTypes.get
- dataflow.jobs.create
- dataflow.jobs.get
- dataflow.messages.list
- dataflow.metrics.get
Read and write to Cloud Storage, the base storage for Cloud Dataprep:
- storage.buckets.get
- storage.buckets.list
- storage.objects.create
- storage.objects.delete
- storage.objects.get
- storage.objects.list
- storage.objects.update
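As referenced above, the following sketch checks whether the current credentials hold the listed permissions at the project level, using the Cloud Resource Manager testIamPermissions method. The project ID is a placeholder, and permissions granted only at the bucket or dataset level are not reflected in this project-level check.

```python
# Sketch: verify the permissions listed above for the current credentials.
# PROJECT_ID is a placeholder; the permission names are copied from the lists.
from google.auth import default
from googleapiclient import discovery

PROJECT_ID = "my-project-id"  # placeholder

REQUIRED_PERMISSIONS = [
    # BigQuery
    "bigquery.datasets.get", "bigquery.jobs.create", "bigquery.tables.create",
    "bigquery.tables.get", "bigquery.tables.getData", "bigquery.tables.list",
    # Cloud Dataflow
    "compute.machineTypes.get", "dataflow.jobs.create", "dataflow.jobs.get",
    "dataflow.messages.list", "dataflow.metrics.get",
    # Cloud Storage
    "storage.buckets.get", "storage.buckets.list", "storage.objects.create",
    "storage.objects.delete", "storage.objects.get", "storage.objects.list",
    "storage.objects.update",
]

credentials, _ = default()
crm = discovery.build("cloudresourcemanager", "v1", credentials=credentials)

# testIamPermissions returns the subset of permissions the caller holds
# on the project resource.
response = crm.projects().testIamPermissions(
    resource=PROJECT_ID,
    body={"permissions": REQUIRED_PERMISSIONS},
).execute()

held = set(response.get("permissions", []))
missing = [p for p in REQUIRED_PERMISSIONS if p not in held]
print("Missing permissions:", missing or "none")
```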
...