This section contains an archive of release notes for previous releases of Designer Cloud Powered by Trifacta.


For the latest release notes, see Release Notes for Designer Cloud Powered by Trifacta.

March 10, 2022

Release 9.1

What's New

Connectivity:

Early Preview (read-only) connections available with this release:

Job execution:

The application can check for changes to your datasets' schemas before jobs are executed and can optionally halt job execution to prevent data corruption. These options can be configured by a workspace administrator. For more information, see Workspace Settings Page.

Tip: Schema validation can be overridden for individual jobs. For more information, see Run Job Page.

For more information, see Overview of Schema Management.
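
To illustrate the idea behind schema validation outside of the product, the following sketch compares a dataset's last-known schema against the schema currently reported by the source and halts before running a job if they differ. This is a minimal, hypothetical example; the function and variable names are not part of the product.

# Minimal sketch of pre-job schema validation. A schema is assumed to be a list
# of (column name, data type) pairs captured when the dataset was imported.
from typing import List, Tuple

Schema = List[Tuple[str, str]]

def validate_schema(expected: Schema, current: Schema) -> None:
    # Halt job execution if columns were added, removed, renamed, or retyped.
    if expected != current:
        raise RuntimeError(
            "Schema has changed since import; halting job to prevent data corruption."
        )

expected = [("id", "Integer"), ("name", "String")]
current = [("id", "Integer"), ("name", "String"), ("email", "String")]

try:
    validate_schema(expected, current)
except RuntimeError as err:
    print(err)  # with schema validation disabled or overridden, the job would run anyway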

Dataset configuration:

For an imported dataset, you can configure settings through a new interface, including the column names and column data types to use in the application.

NOTE: This experimental feature is intended for demonstration purposes only. This feature may be modified or removed from the application without warning in a future release. It should not be deployed in a production environment.

NOTE: This feature is part of a larger effort to improve how data is imported into the application. This feature must be enabled by a workspace administrator.

Sample Job IDs:

When a sample is collected, a job ID is generated and displayed in the application. These job IDs enable you to identify the sample jobs.

Import:

For long-loading Parquet datasets, you can monitor the ingest process as you continue your work.

For more information, see Flow View Page.

Changes in System Behavior

Snowflake execution:

Support for splitting columns on multiple delimiters during Snowflake pushdown execution. See Snowflake Running Environment.

Performance:

A recent release introduced improved performance through intelligent caching of recipe steps.

Connectivity:

Deprecated

None.

Key Bug Fixes

None.

New Known Issues

Ticket    Description
TD-69279

Test Connection button fails with a ValidationFailed error when editing a working connection configured with SSH tunneling.

Workaround: Select/deselect the Enable SSH Tunneling checkbox, and then click Test Connection.

January 31, 2022

Release 9.0

What's New

Snowflake Running Environment:

For flows whose data sources and outputs are hosted in Snowflake, you can now push down execution of transformation jobs to Snowflake for much faster execution. For more information, see Snowflake Running Environment.

Dataset Schema Refresh:

You can now refresh your imported datasets with the current schema information from the source file or table. Schema refresh enables you to capture any changes to the columns in your dataset.

Connectivity:

Build connections to accessible REST API endpoints.

This feature is disabled by default. For more information about enabling REST API connectivity in your environment, please contact Support.

For more information, see REST API Connections.

Connectivity:

Early Preview (read-only) connections available with this release:

Changes in System Behavior

New configuration task for S3 connections

When you create a new workspace, the process by which you connect your workspace to S3 has improved.

Tip: Future releases will include additional improvements in connecting to AWS and S3 resources.

For more information, see Changes to Configuration.

Deprecated

None.

Key Bug Fixes

Ticket    Description
TD-68162

Flow parameters cannot be displayed or edited in the Transformer page and cannot be embedded in recipe steps.

New Known Issues

None.

January 10, 2022

Release 8.11

What's New

Connectivity:

Early Preview (read-only) connections available with this release:

Performance:

Changes in System Behavior

Sample sizes can be increased up to 40 MB

Prior to this release, the size of a sample was capped at 10 MB. This size represented:

Beginning in this release:

For more information, see Change Recipe Sample Size.

Data type mismatches can now be written out in CSV format

Beginning in this release, mismatched values in CSV outputs are written as regular values by default. In prior releases, mismatched values were written as null values in CSV outputs.

See Improvements to the Type System.
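
As a rough illustration of the change (hypothetical data, not product code), consider a column typed as Integer that contains the mismatched value "abc". The snippet below shows what the two behaviors produce in the written CSV rows.

import csv, io

row = {"id": "2", "amount": "abc"}   # "abc" mismatches the column's Integer type

old_behavior = ["2", ""]             # prior releases: mismatched values written as nulls (empty)
new_behavior = ["2", row["amount"]]  # this release: mismatched values written as regular values

buffer = io.StringIO()
csv.writer(buffer).writerows([old_behavior, new_behavior])
print(buffer.getvalue())             # "2," versus "2,abc"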

Deprecated

None.

Key Bug Fixes

None.

New Known Issues

Ticket    Description
TD-68162

Flow parameters cannot be displayed or edited in the Transformer page and cannot be embedded in recipe steps.

Workaround: To edit your flow parameters, select Parameters from the Flow View context menu.

NOTE: There is no current workaround for embedding in recipe steps. While your existing parameters should continue to work at execution time, avoid changing names of your flow parameters or editing recipe steps in which they are referenced. New flow parameters cannot be used in recipes at this time.


November 22, 2021

Release 8.10

What's New

Connectivity:

Session Management:

You can view the current and recent sessions for your account in the application. As needed, you can revoke any unfamiliar devices or sessions. For more information, see Sessions Page.

Download vCPU usage:

You can download a detailed report of your vCPU usage in CSV format.

For more information, see Workspace Usage Page.

Changes in System Behavior

Ingestion:

Maximum permitted record length has been increased from 1 MB to 20 MB. For more information, see Working with JSON v2.

Publishing:

Improvements to publishing to Snowflake. For more information, see Improvements to the Type System.

Split transformation:

When splitting a column based on positions, the positions no longer need to be listed in numeric order. See Changes to the Language.

Deprecated

None.

Key Bug Fixes

None.

New Known Issues

Ticket    Description
TD-66185

Flatten transformation cannot handle multi-character delimiters.

Workaround: When a column of arrays is flattened using the running environment, multi-character String delimiters are not supported. As a workaround, you can create a regular expression delimiter, as in the following, which uses either left bracket or right bracket as the delimiter:

/[\[\]]/
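
For reference, the following Python snippet (illustrative only, not product code) shows how this kind of character-class regular expression splits a value on either bracket character:

import re

value = "[a][b][c]"
# Split on either "[" or "]" and drop the empty strings left between delimiters.
parts = [p for p in re.split(r"[\[\]]", value) if p]
print(parts)  # ['a', 'b', 'c']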

October 26, 2021

Release 8.9

What's New

Plans:

Collaboration:

Connectivity:

Sampling:

Changes in System Behavior

None.

Deprecated

None.

Key Bug Fixes

Ticket    Description
TD-65502    Datasets with parameters are improperly permitted to be referenced in recipes, which returns an error during job execution.

New Known Issues

None.

September 27, 2021

Release 8.8

What's New

Subscribe and upgrade plans via credit card:

Through the Admin console, you can now view and manage your plan, including changing the plan, billing information, and the number of licensed users. You can also upgrade, downgrade, or cancel your subscription. For more information, see Plans and Billing Page.

vCPU consumption:

Administrators can now view the total workspace consumption in vCPU hours. For more information, see Workspace Usage Page.

Flow View:

Changes in System Behavior

Import:

Improvements have been made in how double quotes are handled in CSV files during import to align the application with other systems that support CSV import.

API:

Deprecated

None.

Key Bug Fixes

None.

New Known Issues

Ticket    Description
TD-63974

In imported datasets sourced from CSV files, double quotes that are escaped with a backslash (\"backslash-escaped value\") can cause the remainder of the row to be compressed into a single cell.
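
For context, standard CSV (RFC 4180) escapes a double quote by doubling it, so parsers that expect doubled quotes can misread backslash escapes. The following Python snippet, using hypothetical data, illustrates the difference:

import csv, io

# A row whose embedded quotes are backslash-escaped (non-standard for CSV):
backslash_line = '1,"a \\"quoted\\" value","next"\n'
# The same row using the RFC 4180 convention of doubling the quote character:
doubled_line = '1,"a ""quoted"" value","next"\n'

print(next(csv.reader(io.StringIO(doubled_line))))
# ['1', 'a "quoted" value', 'next'] -- parsed as intended

print(next(csv.reader(io.StringIO(backslash_line))))
# A parser expecting doubled quotes mangles the quoted field.

print(next(csv.reader(io.StringIO(backslash_line), doublequote=False, escapechar="\\")))
# ['1', 'a "quoted" value', 'next'] -- correct again once the escape character is declared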


September 15, 2021

Release 8.7 - push 2

What's New

Templates:


From the Flows page, you can now access pre-configured templates directly from the templates gallery.

Tip: Click Templates in the Flows page. Select the template, and the template is opened in Flow View for you.

Changes in System Behavior

None.

Deprecated

None.

Key Bug Fixes

None.

New Known Issues

None.

September 1, 2021

Release 8.7

What's New

Browsers:

Plans:

Connectivity:

Recipe panel:

Changes in System Behavior

None.

Deprecated

API:

Key Bug Fixes

Ticket    Description
TD-63564

Schedules created by a flow collaborator with editor access stop working if the collaborator is removed from the flow.

Collaborators with viewer access cannot create schedules.

New Known Issues

Ticket    Description
TD-63517

Unpivoting a String column preserves null values in Spark but converts them to empty strings in Photon. Running jobs on different running environments generates different results.

Workaround:  After the unpivot step, you can add an Edit with formula step. Set the columns to all of the columns in the unpivot and add the following formula, which converts all missing values to null values:

if(ismissing($col),NULL(),$col)
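
The same conversion expressed outside the product (an illustrative Python sketch with hypothetical values): any empty string produced by the unpivot is mapped back to a true null.

# Hypothetical unpivoted values as produced on the Photon running environment.
photon_values = ["a", "", "b", ""]   # empty strings stand in for missing values

normalized = [None if v == "" else v for v in photon_values]
print(normalized)  # ['a', None, 'b', None] -- matches the null values produced on Spark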

August 2, 2021

Release 8.6

What's New

NOTE: Trifacta Standard product edition is no longer available. All customers of this product edition have been moved to other product editions.

Rename Workspace


Collaboration:
Workspace administrators can now edit the workspace name. Select Admin console > Workspace settings.
For more information, see Workspace Settings Page.

Connectivity:

Performance:

Better Handling of JSON files

The application now supports regularly formatted JSON files during import. You can now import flat JSON records contained in a single array object. With this change, each record in the array is treated as a single line and imported as a new row. For more information, see Working with JSON v2.
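
As an illustration (hypothetical file contents, not product code), a file shaped like the following imports with one row per record in the array:

import json

# Contents of a hypothetical records.json: a single top-level array of flat JSON records.
contents = '[{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]'

for record in json.loads(contents):
    print(record)   # each record becomes one row on import
# {'id': 1, 'name': 'alpha'}
# {'id': 2, 'name': 'beta'}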

Changes in System Behavior 

None.

Deprecated

None.

Key Bug Fixes

None.

New Known Issues

Ticket    Description
TD-63564

Schedules created by a flow collaborator with editor access stop working if the collaborator is removed from the flow.

Tip: Flow owners can delete the schedule and create a new one. When this issue is fixed, the original schedule will continue to be executed under the flow owner's account.

Collaborators with viewer access cannot create schedules.

June 28, 2021

Release 8.5

What's New

Parameterization:

Flow View:

Job execution:

Connectivity:

Contribute to the future direction of connectivity: Click I'm interested on a connection card to upvote adding the connection type to the application. See Create Connection Window.

Connectivity:

Developer:

Download and install the Python SDK, which enables you to leverage the visual tools of the application to transform data in your existing Python pipelines.

NOTE: This is an Alpha release. Do not deploy the Python SDK in production environments.

For more information, see Python SDK.

Job execution:

You can choose to ignore the recipe errors before job execution and then review any errors in the recipe through the Job Details page.

Resource usage:

Language:

Changes in System Behavior

None.

Key Bug Fixes

None.

New Known Issues

None.

June 7, 2021

Release 8.4 push 2

What's New

Template Gallery:

Connectivity:

Changes in System Behavior

None.

Key Bug Fixes

None.

New Known Issues

None.

May 24, 2021

Release 8.4

What's New

Self-serve upgrades from your free trial: Through the Admin console, you can review and select the paid plan that works for you. Submit payment details through the application and begin using your upgraded workspace immediately.

Connectivity:

Collaboration:

Language:

Changes in System Behavior

None.

Key Bug Fixes

Ticket    Description
TD-60881    Incorrect file path and missing file extension in the application for parameterized outputs.
TD-60382

Date format M/d/yy is handled differently by the PARSEDATE function on Photon and Spark.

New Known Issues

None.

May 5, 2021

Release 8.3 - push 2

What's New

None.

Changes in System Behavior

Improved method of JSON import

The application now uses the conversion service to ingest JSON files during import. This improved method of ingestion can save significant time wrangling JSON into records.

NOTE: The new method of JSON import is now enabled by default. It can be disabled as needed. Additional information is below.

For more information, see Working with JSON v2.

Key Bug Fixes

None.

New Known Issues

None.

April 26, 2021

Release 8.3

What's New

Single login for multiple workspaces: Log in through https://cloud.trifacta.com, and then choose the workspace to which you want to connect. If you are a member of multiple workspaces, you can switch between workspaces through the User menu. For more information, see Home Page.

Connectivity:

You can publish results to external S3 buckets through an Access key and Secret key. For more information, see External S3 Connections.

Job execution:

Introducing new filter pushdowns to optimize the performance of your flows during job execution. For more information, see Flow Optimization Settings Dialog.

Job results:

You can now preview job results and download them from the Overview tab of the Job details page. For more information, see Job Details Page.

Tip: You can also preview job results in Flow View. See View for Outputs.

Changes in System Behavior

Improved method of JSON import

Beginning in this release, the application uses the conversion service to ingest JSON files during import. This improved method of ingestion can save significant time wrangling JSON into records.

NOTE: The new method of JSON import is disabled by default but can be enabled as needed.

For more information, see Working with JSON v2.

Flows that use imported datasets created using the old method continue to work without modification.

NOTE: Support for the v1 version of JSON import is likely to be deprecated in a future release. You should switch to the new version as soon as possible.

Future work on support for JSON is targeted for the v2 version only.

For more information on using the old version and migrating to the new version, see Working with JSON v1.

API

The following API endpoints are scheduled for deprecation in a future release:

NOTE: Please avoid using the following endpoints.

/v4/connections/vendors
/v4/connections/credentialTypes
/v4/connections/:id/publish/info
/v4/connections/:id/import/info

These endpoints have little value for public use.

Key Bug Fixes

Ticket    Description
TD-60701    Most non-ASCII characters incorrectly represented in visual profile downloaded in PDF format.
TD-59854    Datetime column from Parquet file inferred to the wrong data type on import.

New Known Issues

None.

March 31, 2021

Release 8.2 - push 2

What's New

This is the initial release of the following product editions:

Upgrade: Trial customers can upgrade through the Admin console. See Admin Console.

Changes in System Behavior

None.

Key Fixes

None.

New Known Issues

None.

March 22, 2021

Release 8.2

What's New

Preferences:

API:

Connectivity:

Plan metadata references:

Use metadata values from other tasks and from the plan itself in your HTTP task definitions.

Spark Execution Properties Settings:

Advanced Spark parameters have been newly added to the Spark running environment. Use these parameters to override the global Spark settings for individual jobs. For more information, see Spark Execution Properties Settings.
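
For context, these per-job overrides correspond to standard Spark properties. The values below are a hypothetical example only; which properties can be overridden in the product may vary.

# Illustrative only: standard Spark properties that are commonly tuned per job.
spark_overrides = {
    "spark.executor.memory": "6g",
    "spark.executor.cores": "2",
    "spark.sql.shuffle.partitions": "200",
}

for prop, value in spark_overrides.items():
    print(f"{prop}={value}")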

Improve Accessibility of Job Results:

The Jobs tabs have been enhanced to display the latest and previous jobs that have been executed for the selected output.

For more information, see View for Outputs.

Sample Jobs Page:

You can monitor the status of all sample jobs that you have generated. Project administrators can access all sample jobs in the workspace. For more information, see Sample Jobs Page.

Changes in System Behavior

None.

Key Fixes

Ticket    Description
TD-59236    Use of percent sign (%) in file names causes the Transformer page to crash during preview.
TD-59218    BOM characters at the beginning of a file cause multiple headers to appear in the Transformer page.
TD-58932    Cannot read file paths with colons from EMR Spark jobs.
TD-58802    Running a job on Photon using a zipped CSV that was uploaded yields garbage characters in output.
TD-58694    Very large number of files generated during Spark job execution.

New Known Issues

Ticket    Description
TD-60701    Most non-ASCII characters incorrectly represented in visual profile downloaded in PDF format.

March 9, 2021

Release 8.1, push 2

What's New

Macro updates:

You can replace an existing macro definition with a macro that you have exported to your local desktop.

NOTE: Before you replace the existing macro, you must export a macro to your local desktop. For more information, see Export Macro.

For more information, see Macros Page.

Early Preview connections available with this release:

Changes in System Behavior

None.

Key Fixes

None.

New Known Issues

None.

February 26, 2021

Release 8.1

What's New

Storage environment:

Connectivity:


API:


Customize connection types (connectors) to ensure consistency across all connections of the same type and to meet your enterprise requirements. For more information, see Changes to the APIs.

Job results:

Results of data quality checks are now part of the visual profile PDF available with your job results. The PDF includes the data quality results across the entire dataset.

Specify column headers during import

You can specify the column headers for your dataset during import. For more information, see Import Data Page.

Changes in System Behavior

None.

Key Bug Fixes

Ticket    Description
TD-56170    The Test Connection button for some relational connection types does not perform a test authentication of user credentials.
TD-54440

Header sizes at intermediate nodes for JDBC queries cannot be larger than 16K.

Previously, the column names for JDBC data sources were passed as part of a header in a GET request. For very wide datasets, these GET requests often exceeded 16K in size, which represented a security risk.

New Known Issues

None.

January 26, 2021

Release 8.0

What's New

Connectivity:

Import:

Recipe development:

Metric-based data quality rules:

Update Macros:

Job execution:

APIs:

Changes in System Behavior

None.

Key Bug Fixes

Ticket    Description
TD-57180

AWS jobs run on Photon to publish to HYPER format fail during file conversion or writing.

New Known Issues

Ticket    Description
TD-56170

The Test Connection button for some relational connection types does not perform a test authentication of user credentials.

Workaround: Append the following to your Connect String Options:

;ConnectOnOpen=true

This option forces the connection to validate user credentials as part of the connection. There may be a performance penalty when this option is used.
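
For illustration only, appending the option to an existing set of connect string options looks like the following. The surrounding option values are hypothetical; only ;ConnectOnOpen=true comes from this note.

# Hypothetical existing options for a relational connection.
connect_string_options = "encrypt=true;loginTimeout=30"

# Append the workaround option so credentials are validated when the connection is opened.
connect_string_options += ";ConnectOnOpen=true"
print(connect_string_options)  # encrypt=true;loginTimeout=30;ConnectOnOpen=true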