
...

Connectivity:

  • Early Preview (read-only) connections are available with this release.

    Supported editions: gdpent, gdppro, gdppr

Sampling:

  • Adjust the size of samples loaded in the browser for your current recipe to improve performance and address low-memory conditions. See Change Recipe Sample Size.

Changes in System Behavior

None.

Deprecated

None.

...

Changes in System Behavior

None.

Deprecated

None.

Key Bug Fixes

Ticket: TD-65502
Description: Datasets from parameters are improperly permitted to be referenced in recipes, which causes an error during job execution.

New Known Issues

None.

October 12, 2021

Release 8.8

What's New

Project Usage:

  • vCPU usage and active users are now displayed in the Cloud Dataprep application for administrators. For more information, see Usage Page.

Trifacta Photon:

  • You can now configure the Cloud Dataprep application to execute Trifacta Photon jobs in your VPC. This feature is in Beta.

    Supported editions: gdpent

    For more information, please contact Trifacta Support.

Changes

Cancellation of jobs is temporarily disabled:

In previous releases, you could cancel in-progress flow and sampling jobs through the Cloud Dataprep application. As of this release, cancellation of job types such as sampling, transformation, and profiling jobs is temporarily disabled.

Info

NOTE: This change applies to all types of jobs executed across all running environments, including BigQuery. For plan runs, some jobs, such as flow tasks, may run to completion before the plan is canceled.

Tip

Tip: For Dataflow jobs, you can still cancel them through the Dataflow interface in Google Cloud Platform.

Job cancellation may be re-enabled in the future.

Billing:

Charges for your project and user usage of Cloud Dataprep are applied to your account based on the UTC (Greenwich) time zone. However, Google Marketplace tracks and reports usage based on the Pacific (U.S. West Coast) time zone, so some discrepancies in reporting have been observed.

Beginning at the end of October 2021, these discrepancies will be addressed: the daily reporting interval will be changed to start and end at midnight Pacific time, matching how Google Marketplace reports. Usage tracking itself remains based on the UTC time zone.

Info

NOTE: The offset between Pacific time and UTC changes during the year:

  • Pacific time is UTC-07:00 during daylight saving time.
  • Pacific time is UTC-08:00 during standard time.

vCPU usage has been tracked on an hourly basis and is unchanged.

For more information, see Usage Page.
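As a quick illustration of the offsets noted above, the following Python sketch (the dates are illustrative, not taken from the product) converts a midnight-Pacific reporting boundary to UTC using only the standard library:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

pacific = ZoneInfo("America/Los_Angeles")

# Midnight Pacific during daylight saving time corresponds to 07:00 UTC.
dst_boundary = datetime(2021, 10, 31, 0, 0, tzinfo=pacific)
print(dst_boundary.astimezone(timezone.utc))  # 2021-10-31 07:00:00+00:00

# Midnight Pacific during standard time corresponds to 08:00 UTC.
std_boundary = datetime(2021, 12, 1, 0, 0, tzinfo=pacific)
print(std_boundary.astimezone(timezone.utc))  # 2021-12-01 08:00:00+00:00
```

Because the offset shifts by one hour twice a year, a "daily" Pacific reporting window covers a different span of UTC-tracked hours depending on the season.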

Import:

Improvements have been made in how double quotes are handled in CSV files during import, to align Cloud Dataprep with other systems that support CSV import.

  • Example values in source CSV file:

    Code Block
    """My product""",In stock,"16,000",0.05

    Note that the value 16,000 must be double-quoted, since the value contains a comma, which is the field delimiter.

  • Previously, these values appeared in the Transformer page columns as follows:

    c1: """My product"""
    c2: In stock
    c3: "16,000"
    c4: 0.05
  • As of this version, the Cloud Dataprep application handles the values in a better manner when displaying them in the Transformer page:

    c1: "My product"
    c2: In stock
    c3: 16,000
    c4: 0.05
    • c1: Escaped values (triple double-quotes) in the source no longer render in the application as triple double-quotes; they are represented as quoted values.

    • c3: Note that the double quotes in c3 have been stripped. Leading and trailing quotes are trimmed if the quotes are balanced within a cell.

      Info

      NOTE: This change in behavior applies only to newly created imported datasets sourced from a CSV file. Existing imported datasets should not be affected. However, if a newly imported dataset is transformed by a previously existing recipe that compensated for the extra quotes in the Transformer page, the effects on output data could be unpredictable. These recipes and their steps should be reviewed.

      This change does apply to any newly imported dataset sourced from CSV and may cause the data to change. For example, if you export an older flow and import into a new workspace or project, this change in parsing behavior applies to the datasets that are newly created in the new environment. Recipes may require review upon import.

  • When results are generated in CSV, output files should continue to reflect the formatting of the source data before import. See above.

    Tip

    Tip: You can also choose the Include quotes option when creating a CSV output.

    • When profiling is enabled, values that appear in CSV as "" are now marked as missing.
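The quoting convention described above matches the common CSV convention in which a doubled quote inside a quoted field is an escaped literal quote. A quick check with Python's standard csv module, using the sample line from the example above:

```python
import csv
import io

# The example source line: an escaped-quote field, a plain field,
# a quoted field containing the delimiter, and a numeric field.
line = '"""My product""",In stock,"16,000",0.05\n'
row = next(csv.reader(io.StringIO(line)))
print(row)  # ['"My product"', 'In stock', '16,000', '0.05']
```

Note that the parsed values match the new Transformer page rendering: the escaped quotes collapse to a single pair, and the quotes around 16,000 are consumed by the parser.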

API:

  • To prevent overloading mission-critical API endpoints, rate limiting has been implemented on a select set of API endpoints in the platform. For more information, see Changes to the APIs.
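Clients calling rate-limited endpoints typically retry on HTTP 429 with exponential backoff. A minimal, hedged sketch of such a client-side loop (the `request_fn` callable and the retry limits are illustrative, not part of the documented API):

```python
import random
import time

def call_with_backoff(request_fn, max_retries=5, base_delay=1.0):
    """Call request_fn, retrying with jittered exponential backoff on HTTP 429.

    request_fn is any zero-argument callable returning (status, body).
    """
    for attempt in range(max_retries):
        status, body = request_fn()
        if status != 429:  # not rate-limited: return the response as-is
            return status, body
        # Jittered exponential backoff: ~1s, ~2s, ~4s, ... by default.
        time.sleep(base_delay * (2 ** attempt) + random.random() * base_delay)
    return status, body
```

Wrapping API calls this way keeps scripts working when they occasionally hit the new limits, while spacing out retries so they do not worsen the overload.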

BigQuery Running Environment:

When running jobs in BigQuery, some additional data types, functions, and transformations are now supported:

  • Data types: The following data types are now supported for execution in BigQuery:

    • Arrays
    • Objects (Maps)
  • Aggregation functions: 
  • Date functions:
    • WEEKNUM
    • CONVERTFROMUTC
    • CONVERTTOUTC
    • CONVERTTIMEZONE
    • DATEDIF: All unit types are now supported.
    • See Date Functions.
  • String functions:

  • Nested functions:

    • ARRAYCONCAT
    • ARRAYCROSS
    • ARRAYINTERSECT
    • ARRAYLEN
    • ARRAYSTOMAP
    • ARRAYUNIQUE
    • ARRAYZIP
    • FILTEROBJECT
    • KEYS
    • ARRAYELEMENTAT
    • LISTAVERAGE
    • LISTMAX
    • LISTMIN
    • LISTMODE
    • LISTSTDEV
    • LISTSUM
    • LISTVAR
    • ARRAYSORT
    • ARRAYINDEXOF
    • ARRAYMERGEELEMENTS
    • ARRAYRIGHTINDEXOF
    • ARRAYSLICE
    • See Nested Functions.
  • Other functions:
  • Transformations:

    Search term → Transform

    Unnest elements → unnest
    Expand Array to rows → flatten
    Extract between delimiters → extractbetweendelimiters
    Unpivot → unpivot
    Standardize column → standardize
    Nest columns → nest
    Extract matches to Array → extractlist
    Replace between delimiters → replacebetweenpatterns
    Scale to min max → scaleminmax
    Scale to mean → scalestandardize
    Convert key/value to Object → extractkv
    Join datasets → join (for more information, see Join Types)
    • Legend:

      • Search term: the value you enter in the Transform Builder
      • Transform: name of the underlying transform
      • For more information, see Transformation Reference.

Deprecated

None.

Known Issues

None.

Fixes

Ticket: TD-65502
Description: Datasets from parameters are improperly permitted to be referenced in recipes, which causes an error during job execution.

New Known Issues

Ticket: TD-64383
Description: Dataflow jobs that use custom SQL to query an authorized view may fail when the Service Account in use has access to the authorized view but not to the underlying BigQuery table.

September 15, 2021

Release 8.7

What's New

Templates:

Supported editions: gdpent, gdppro, gdpsta, gdppr, gdpst

...

Changes

None.

Deprecated

API:

  • The deprecated API endpoint for transferring assets between users has been removed from the platform. This endpoint was previously replaced by an improved method of transfer.
  • Some connection-related endpoints have been deprecated. These endpoints have little value for public use.
  • For more information, see Changes to the APIs.

Known Issues

Ticket: TD-63517
Description: Unpivoting a String column preserves null values in Dataflow but converts them to empty strings in Trifacta Photon. Running jobs in the different running environments generates different results.

Tip

Workaround: After the unpivot step, you can add an Edit with formula step. Set the columns to all of the columns in the unpivot, and add the following formula, which converts all missing values to null values:

Code Block
if(ismissing($col),NULL(),$col)
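For clarity, the effect of the workaround formula can be sketched in plain Python (the sample values are hypothetical; in Wrangle, `ismissing` treats empty strings as missing):

```python
def normalize(value):
    """Convert missing values (empty string or None) to a true null (None)."""
    return None if value in ("", None) else value

print([normalize(v) for v in ["a", "", None, "b"]])  # ['a', None, None, 'b']
```

After this normalization, both running environments produce the same nulls for missing values.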



Fixes

Ticket: TD-63564
Description: Schedules created by a flow collaborator with editor access stop working if the collaborator is removed from the flow. Collaborators with viewer access cannot create schedules.


August 16, 2021

Release 8.6

What's New

Template Gallery:

Tip

Tip: You can start a trial account by selecting a pre-configured template from our templates gallery. See www.trifacta.com/templates.

Collaboration:

...

Better Handling of JSON files:

The Cloud Dataprep application now supports regularly formatted JSON files during import. You can now import flat JSON records contained in a single array object; each element of the array is treated as a single record and imported as a new row. For more information, see Working with JSON v2.
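The import behavior described above can be illustrated with Python's standard json module; the file contents below are illustrative, not from the product:

```python
import json

# A flat array of records: the whole file is one JSON array.
raw = '[{"sku": 1, "name": "hammer"}, {"sku": 2, "name": "wrench"}]'

rows = json.loads(raw)  # each array element becomes one row
for row in rows:
    print(row)
# {'sku': 1, 'name': 'hammer'}
# {'sku': 2, 'name': 'wrench'}
```

Here the two objects in the array import as two rows, with one column per key.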

Usage reporting:

Detailed reporting on vCPU usage and active users is now available in the Cloud Dataprep application.

Info

NOTE:  Active user reporting may not be available until September 1, 2021 or later.

For more information, see Usage Page.

Changes

Dataflow machines:

  • The following machine types are now available when running a Dataflow job:

    Code Block
    "e2-standard-2",
    "e2-standard-4",
    "e2-standard-8",
    "e2-standard-16",
    "e2-standard-32"

Deprecated

None.

Known Issues

  • TD-63564: Schedules created by a flow collaborator with editor access stop working if the collaborator is removed from the flow.

    • Tip: Flow owners can delete the schedule and create a new one. When this issue is fixed, the original schedule will continue to be executed under the flow owner's account.

    • Collaborators with viewer access cannot create schedules.

Fixes

  • TD-61478: Time-based data types are imported as String type from BigQuery sources when type inference is disabled.

July 20, 2021

Release 8.5

What's New

Tip

Tip: When you complete your trial of the gdpent or gdppro product edition, you can choose to license a higher or lower tier product edition. For more information, see Product Editions.

...

  • Review the total vCPU hours consumed by your datasets, recipes, and job execution within your project across an arbitrary time period. 

Changes

None.

Deprecated

None.

Known Issues

None.

Fixes

  • TD-62190: You may not be able to view the SQL that was used to execute a job within BigQuery. This issue is due to a regression in the new BigQuery console in which job identifiers containing dashes are not supported. A ticket has been filed with Google.

June 7, 2021

Release 8.4

What's New

Template Gallery:

  • Check out the new gallery of flow templates, which can be imported into your workspace. These templates are pre-configured to solve the most compelling loading and transformation use cases in the product. For more information, see www.trifacta.com/templates.
    • For more information on importing flows into your workspace, see Import Flow.
    • For more information on using a template in the product, see Start with a Template

...

Changes

Trifacta Photon limits on execution time

...

In conjunction with the previous change, execution of scheduled jobs is not supported on Trifacta Photon. Since Trifacta Photon jobs are now limited to 10 minutes of execution time, scheduled jobs have been automatically migrated to execution on Dataflow to provide better execution success. For more information, see Trifacta Photon Running Environment.

Deprecated

None.

Known Issues

  • TD-62190: You may not be able to view the SQL that was used to execute a job within BigQuery. This issue is due to a regression in the new BigQuery console in which job identifiers containing dashes are not supported. A ticket has been filed with Google.

Fixes

  • TD-60881: Incorrect file path and missing file extension in the application for parameterized outputs.
  • TD-60382: Date format M/d/yy is handled differently by the PARSEDATE function on Trifacta Photon and Spark.

May 20, 2021

Release 8.3 - push 3

What's New

Connectivity:

  • Support for SFTP connections.

    Supported editions: gdpent, gdppro, gdppr

    Info

    NOTE: This connection type is import only.

    For more information, see SFTP Connections.

Changes

Trifacta Photon enabled by default

...

Trifacta Photon can be enabled or disabled by a project administrator. For more information, see Dataprep Project Settings Page.

Deprecated

None.

Known Issues

None.

Fixes

None.

May 10, 2021

Release 8.3

What's New

Running Environments:

...

Tip

Tip: You can also preview job results in Flow View. See View for Outputs.

Changes

Improved method of JSON import

...

For more information on using the old version and migrating to the new version, see Working with JSON v1.

Deprecated

None.

Known Issues

  • TD-61478: Time-based data types are imported as String type from BigQuery sources when type inference is disabled.

Fixes

  • TD-60701: Most non-ASCII characters incorrectly represented in visual profile downloaded in PDF format.
  • TD-59854: Datetime column from Parquet file incorrectly inferred to the wrong data type on import.

April 26, 2021

Release 8.2 push2

What's New

Tip

Upgrade: Trial customers can upgrade through the Admin console. See Admin Console.

...

  • gdpent edition
  • gdppro edition
  • gdpsta edition

Changes

None.

Deprecated

None.

Known Issues

None.

Fixes

None.

April 14, 2021

Release 8.2

...

  • gdpent edition
  • gdppro edition
  • gdpsta edition

What's New

Photon:

Introducing Trifacta Photon, an in-memory running environment for running jobs. Embedded in Cloud Dataprep, Trifacta Photon delivers improved performance in job execution and is best suited for small- to medium-sized jobs.

...

From the Home Page, you can quickly redesign your output and destination experience. The step-by-step procedures enable you to create an improved and streamlined output creation experience. For more information, see Start with a Template.

Changes

Improved methods for disabling the product:

...

These endpoints have little value for public use.

Deprecated

None.

Known Issues

  • TD-60701: Most non-ASCII characters incorrectly represented in visual profile downloaded in PDF format.

Fixes

  • TD-59236: Use of the percent sign (%) in file names causes the Transformer page to crash during preview.
  • TD-59218: BOM characters at the beginning of a file cause multiple headers to appear in the Transformer page.

...