BigQuery is a fully managed, serverless data warehouse hosted in Google Cloud.
A project owner does not need to enable access to BigQuery. Access to BigQuery is governed by permissions.
The product uses an IAM role to access Google Cloud resources. The default IAM role assigned to each user in the project is granted permission to read and write data in BigQuery.
If you are using a custom IAM role to access Google Cloud resources, you must ensure that the role contains the appropriate permissions to read and write data in BigQuery.
For more information, see Required Dataprep User Permissions.
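If you need to grant read and write access at the level of a specific dataset rather than through a project-level IAM role, BigQuery standard SQL also supports GRANT statements. The following is a sketch only; the project, dataset, and user email are placeholders that you would replace with your own values:

GRANT `roles/bigquery.dataEditor`
ON SCHEMA `my-project.my_dataset`
TO "user:analyst@example.com";

The roles/bigquery.dataEditor role includes permission to read table data and to create and write tables within the dataset.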
The product uses a service account to run jobs. The default service account that is assigned to each user in the project is granted access to BigQuery.
If you are using a custom service account, you must ensure that the account contains the appropriate permissions to read and write data in BigQuery.
For more information, see Google Service Account Management.
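For example, a custom service account can be granted access to an individual table with a GRANT statement. This is a sketch only; the project, dataset, table, and service account name below are placeholders:

GRANT `roles/bigquery.dataEditor`
ON TABLE `my-project.my_dataset.my_table`
TO "serviceAccount:dataprep-runner@my-project.iam.gserviceaccount.com";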
NOTE: The platform supports a single, global connection to BigQuery. All users must use this connection.
You do not need to create a connection to BigQuery. It is accessible based on permissions. See above.
Create via API
You cannot create BigQuery connections through the APIs.
NOTE: JSON data stored in BigQuery is not supported for ingest or publication.
The product can use BigQuery to read source data, to write job results, and, where supported, to run transformation jobs. See below.
NOTE: If you are reading data from BigQuery and writing job results for it back to BigQuery, both locations must be in the same geographic region.
For more information on how data types are converted to and from BigQuery sources, see BigQuery Data Type Conversions.
Use of Customer Managed Encryption Keys (CMEK) is supported and is transparent to the user. For more information, see https://cloud.google.com/kms/docs/cmek.
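Although CMEK handling is transparent to the user, a target table that should be protected with a customer-managed key can be created in advance with the key attached. The following is a sketch only; the project, dataset, table, and key resource name are placeholders:

CREATE TABLE `my-project.my_dataset.protected_table` (
  id INT64,
  name STRING
)
OPTIONS (
  kms_key_name = 'projects/my-project/locations/us/keyRings/my_ring/cryptoKeys/my_key'
);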
You can create a dataset from a table stored in BigQuery.
Views
NOTE: Standard SQL syntax is supported. Legacy SQL syntax is not supported.
NOTE: Reading from external tables or from tables without a schema is not supported.
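Because only standard SQL is supported, any view that you plan to read from should also be defined with standard SQL. A minimal sketch, using placeholder project, dataset, table, and column names:

CREATE VIEW `my-project.my_dataset.filtered_orders` AS
SELECT column1, column2
FROM `my-project.my_dataset.orders`
WHERE column2 IS NOT NULL;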
You can create your datasets by generating custom SQL SELECT statements on your BigQuery tables.
Tip: By pre-filtering your datasets using custom SQL, you can reduce the volume of data that is transferred from the database, which can significantly improve import performance.
For more information, see Create Dataset with SQL.
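For example, a custom SQL statement that pre-filters at the source might look like the following, which extends the syntax example later on this page; the column names and filter value are placeholders:

SELECT column1, column2
FROM `databaseName.tableName`
WHERE column2 >= '2024-01-01';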
If you have access to multiple projects, you can enter a project identifier to browse the BigQuery databases and tables in that project. Enter the project identifier in the textbox, and click Go.
Tip: The identifiers for your projects are available from the Projects menu in the toolbar. See Projects Menu.
Tip: You can paste project identifiers from publicly available projects to read from them.
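When referencing tables across projects in custom SQL, you can use a fully qualified, backtick-delimited name that includes the project identifier. A sketch with placeholder identifiers:

SELECT column1, column2
FROM `other-project-id.databaseName.tableName`;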
You can write datasets to BigQuery as part of your job definition.
NOTE: Object data types are written back to BigQuery as string values. Some Array data types are written as arrays. For more information, see BigQuery Data Type Conversions.
NOTE: BigQuery does not support destinations with a dot (.) in the name. For more information, see https://cloud.google.com/bigquery/docs/tables#table_naming.
Tip: Optionally, you can choose to merge your results into an existing BigQuery table. Based on a specified set of matching columns, you can define the rows to update from your source. A sketch of comparable merge logic in standard SQL appears after these notes.
NOTE: By default, data is published to BigQuery using strict data type matching rules.
For more information on creating a publishing action to BigQuery, see BigQuery Table Settings.
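The merge option described in the tip above is comparable to a standard SQL MERGE keyed on a set of matching columns. The following sketch is illustrative only and does not represent the product's internal implementation; the table and column names are placeholders:

MERGE `my-project.my_dataset.target_table` T
USING `my-project.my_dataset.job_results` S
ON T.customer_id = S.customer_id
WHEN MATCHED THEN
  UPDATE SET T.status = S.status
WHEN NOT MATCHED THEN
  INSERT (customer_id, status)
  VALUES (S.customer_id, S.status);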
If you have access to other projects, you can write to the BigQuery tables associated with those projects, provided that you have write access to each target table.
You can specify the target table as part of the job specification. See Run Job Page.
The following syntax requirements apply to this connection.
Object delimiters: backticks (`)
Example syntax:
SELECT column1, column2 FROM `databaseName.tableName`
For more information, see https://cloud.google.com/bigquery/docs/reference/standard-sql/query-syntax.
For more information on SQL in general, see Supported SQL Syntax.
If your source data and output targets are located in BigQuery, you may be able to execute your recipe transformations inside BigQuery. For more information, see BigQuery Running Environment.
For more information, see BigQuery Data Type Conversions.
Enable: Automatically enabled.
Create New Connection: n/a