...
NOTE: BigQuery is not a running environment that you explicitly select or specify as part of a job. If all of the following requirements are met, then the job is executed in BigQuery when you select Dataflow as the running environment.
- All datasources and all outputs specified in a job: for any job to be executed in BigQuery, all datasources must be located in BigQuery or Cloud Storage, and all outputs must be located within BigQuery.
- Dataflow must be selected as the running environment.
- All recipe steps, including all functions in the recipe, must be translatable to SQL.

NOTE: When attempting to execute a job in BigQuery,
the application executes each recipe in BigQuery, until it reaches a step that cannot be executed there. At that point, data is transferred to Dataflow, where the remainder of the job is executed.

- BigQuery imposes a limit of 1 MB for all submitted SQL queries. If this limit is exceeded during job execution, the product falls back to submitting the job through Dataflow.
- Some transformations and functions are not currently supported for execution in BigQuery. See below.
- Upserts, merges, and deletes are not supported for full execution in BigQuery.
- Sampling jobs are not supported for execution in BigQuery.
- If your recipe includes data quality rules, the job cannot be fully executed in BigQuery.
- BigQuery does not permit partitioned tables to be replaced. As a result, the Drop and Load publishing action is not supported when writing to a partitioned table during BigQuery execution. For more information, see https://cloud.google.com/bigquery/docs/reference/standard-sql/data-definition-language#create_table_statement.
- In BigQuery, escaped whitespace characters (\s) match a broader set of Unicode space characters than in Dataflow, due to differences in the implementation of regular expressions between the two running environments. Depending on your dataset, this difference may result in mismatches between rows in your results when running the same job across different running environments.
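The whitespace-matching difference above can be reproduced locally. The sketch below uses Python's re module purely as a stand-in for two regex engines with different Unicode behavior; it is not the engine that either running environment uses:

```python
import re

# U+00A0 (non-breaking space) is a Unicode space character but not an ASCII one.
sample = "col1\u00a0col2"

# An engine whose \s is Unicode-aware (the broader behavior) finds a match:
unicode_match = re.search(r"\s", sample) is not None

# An engine whose \s covers only ASCII whitespace (the narrower behavior) does not:
ascii_match = re.search(r"\s", sample, re.ASCII) is not None

print(unicode_match, ascii_match)  # True False
```

Rows like this one would be split or matched by a whitespace pattern in one environment but not the other, which is the kind of row-level mismatch described above.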
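The 1 MB limit noted above applies to the SQL text submitted to BigQuery. As a rough illustration of the fallback decision, a planner could measure the generated query before submission; the helper name and the exact byte threshold here are illustrative assumptions, not part of the product:

```python
# Assumption: the limit is checked against the encoded size of the query text.
MAX_QUERY_BYTES = 1024 * 1024  # 1 MB

def should_fall_back(generated_sql: str) -> bool:
    # Measure encoded bytes, not characters: multi-byte UTF-8
    # characters count more than once against the limit.
    return len(generated_sql.encode("utf-8")) > MAX_QUERY_BYTES

print(should_fall_back("SELECT 1"))       # False: small enough for BigQuery
print(should_fall_back("x" * 2_000_000))  # True: submit through Dataflow instead
```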
...