
...

  • For any job to be executed in BigQuery, all datasources must be located in BigQuery or D s storage, and all outputs must be located within BigQuery.
  • D s dataflow must be selected as the running environment.
  • All recipe steps, including all D s lang functions in the recipe, must be translatable to SQL.

    NOTE: When attempting to execute a job in BigQuery, D s webapp executes the recipe in BigQuery until it reaches a step that cannot be executed there. At that point, data is transferred to D s dataflow, where the remainder of the job is executed.

  • BigQuery imposes a limit of 1 MB for all submitted SQL queries. If this limit is exceeded during job execution, D s product falls back to submitting the job through D s dataflow.

  • If the schemas of your datasets have changed, pushdown execution on BigQuery is not supported. D s product falls back to submitting the job through D s dataflow.

  • Some transformations and functions are not currently supported for execution in BigQuery. See below.
  • Upserts, merges, and deletes are not supported for full execution in BigQuery.
  • Sampling jobs are not supported for execution in BigQuery.
  • If your recipe includes data quality rules, the job cannot be fully executed in BigQuery.
  • BigQuery does not permit partitioned tables to be replaced. As a result, the Drop and Load publishing action is not supported when writing to a partitioned table during BigQuery execution. For more information, see https://cloud.google.com/bigquery/docs/reference/standard-sql/data-definition-language#create_table_statement. 
  • In BigQuery, escaped whitespace characters (\s) match a broader set of Unicode space characters than in D s dataflow, due to differences in the regular expression implementations of the two running environments. Depending on your dataset, this difference may result in mismatches between rows in your results when running the same job across different running environments.
  • Some uncommon date formats are not supported for pushdown.
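The whitespace-matching mismatch described above can be reproduced in Python, whose `re` module supports both a Unicode-aware and an ASCII-only interpretation of `\s`. This is a minimal sketch of how two engines that differ in this way classify the same character differently; it does not claim which mode corresponds to which running environment:

```python
import re

NBSP = "\u00a0"  # no-break space, a Unicode space character

# Unicode-aware \s (Python 3 default): matches U+00A0
unicode_match = re.match(r"\s", NBSP) is not None

# ASCII-only \s (restricted to [ \t\n\r\f\v]): does not match U+00A0
ascii_match = re.match(r"\s", NBSP, re.ASCII) is not None
```

Rows containing such characters may therefore be split, trimmed, or matched differently depending on where the job runs.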

D s storage File Support


...

  • Regex patterns used must be valid RE2. Operations on non-RE2 regex patterns are not pushed down.
  • Source metadata references such as $rownumber and $filepath are not supported for pushdown.
  • For more information on limitations of specific pushdowns, see Flow Optimization Settings Dialog.

...

NETWORKDAYS
NETWORKDAYSINTL
MODEDATE
WORKDAY
WORKDAYINTL
MODEDATEIF
KTHLARGESTDATE
KTHLARGESTUNIQUEDATE
KTHLARGESTUNIQUEDATEIF
KTHLARGESTDATEIF
EOMONTH
SERIALNUMBER

Partially supported:

DATEFORMAT: Some uncommon formatting options are not supported for pushdown.

For more information, see Date Functions.

String functions

RIGHTFIND
EXACT
STRINGGREATERTHAN
STRINGGREATERTHANEQUAL
STRINGLESSTHAN
STRINGLESSTHANEQUAL
DOUBLEMETAPHONEEQUALS
TRANSLITERATE

...

  1. In the left nav bar, click the Jobs link.
  2. In the Job History page, select the job that you executed.
  3. In the Overview tab, the value for Environment under the Execution summary should be: BigQuery.

For more information, see Job Details Page.
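If job details are available to you as exported JSON, the manual check above can also be scripted. The payload layout and the `execution_summary`/`environment` field names below are hypothetical stand-ins, not a documented schema:

```python
import json

def ran_in_bigquery(job_details: dict) -> bool:
    """Return True when the execution summary reports BigQuery.

    The 'execution_summary' and 'environment' keys are assumed field
    names for illustration only, not a documented schema.
    """
    summary = job_details.get("execution_summary", {})
    return summary.get("environment") == "BigQuery"

# Fabricated sample payload for illustration:
sample = json.loads('{"execution_summary": {"environment": "BigQuery"}}')
```

Here `ran_in_bigquery(sample)` returns True, while a job whose summary reports a different environment returns False.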

