This section contains an archive of release notes for previous releases of Designer Cloud Powered by Trifacta.
For the latest release notes, see Release Notes for Designer Cloud Powered by Trifacta.
Release 9.1
Connectivity:
Connectivity between the application and your cloud databases using SSH tunneling is generally available with this release.
Tip: This feature is now generally available.
NOTE: For this release, SSH tunneling can be enabled on the following connection types:
For more information, see Configure SSH Tunnel Connectivity.
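As background, SSH tunneling routes database traffic through an intermediate SSH (jump) host so that the database never needs to be exposed directly to the internet. The following is a conceptual sketch only, not the product's configuration; the hosts, credentials, and the third-party sshtunnel and psycopg2 packages are assumptions for illustration:

```python
# Conceptual sketch using the third-party sshtunnel and psycopg2
# packages; hosts and credentials below are placeholders.
from sshtunnel import SSHTunnelForwarder
import psycopg2

with SSHTunnelForwarder(
    ("bastion.example.com", 22),                # public SSH (jump) host
    ssh_username="tunnel-user",
    ssh_pkey="~/.ssh/id_rsa",
    remote_bind_address=("db.internal", 5432),  # private database host
    local_bind_address=("127.0.0.1", 5433),
) as tunnel:
    # The database is reached through the forwarded local port.
    conn = psycopg2.connect(
        host="127.0.0.1",
        port=tunnel.local_bind_port,
        user="db_user",
        password="db_pass",
        dbname="analytics",
    )
    conn.close()
```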
Connectivity:
Early Preview (read-only) connections available with this release:
The application can check for changes to your datasets' schemas before jobs are executed and can optionally halt job execution to prevent data corruption. These options can be configured by a workspace administrator. For more information, see Workspace Settings Page.
Tip: Schema validation can be overridden for individual jobs. For more information, see Run Job Page.
For more information, see Overview of Schema Management.
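As a rough illustration of the idea only (this is not the product's implementation; the names and sample columns below are hypothetical), schema validation amounts to comparing the columns recorded at import time against the source's current columns and optionally halting the job when they differ:

```python
# Hypothetical sketch: detect schema drift before running a job.
expected = ["order_id", "order_date", "amount"]             # schema at import time
current = ["order_id", "order_date", "amount", "currency"]  # schema in the source now

def schema_diff(expected, current):
    """Return columns added to and removed from the source."""
    added = [c for c in current if c not in expected]
    removed = [c for c in expected if c not in current]
    return added, removed

added, removed = schema_diff(expected, current)
if added or removed:
    # Depending on workspace settings, job execution is halted here.
    print(f"Schema drift detected: added={added}, removed={removed}")
```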
Dataset configuration:
For an imported dataset, you can configure settings through a new interface, including the column names and column data types to use in the application.
NOTE: This experimental feature is intended for demonstration purposes only. This feature may be modified or removed from the application.
NOTE: This feature is part of a larger effort to improve how data is imported into the application.
Sample Job IDs:
When a sample is collected, a job ID is generated and displayed in the application. These job IDs enable you to identify the sample jobs.
Import:
For long-loading Parquet datasets, you can monitor the ingest process as you continue your work.
For more information, see Flow View Page.
Snowflake execution:
Support for splitting columns on multiple delimiters during Snowflake pushdown execution. See Snowflake Running Environment.
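To illustrate the semantics locally (the sample value and the Python code below are illustrative only; during pushdown the product generates equivalent logic in Snowflake SQL):

```python
import re

# Split one value on any of several delimiters at once.
value = "red;green|blue,yellow"
parts = re.split(r"[;|,]", value)  # split on ';', '|', or ','
print(parts)  # ['red', 'green', 'blue', 'yellow']
```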
Performance:
A recent release introduced improved performance through intelligent caching of recipe steps.
Due to some recently discovered issues, this feature has been disabled for the time being. It cannot be enabled by a workspace administrator at this time.
NOTE: If this Beta feature had been enabled in your environment, you may experience a reduction in performance when moving between recipe steps in the Transformer page.
The Google Analytics connection type now supports the UniversalAnalytics schema.
NOTE: Previously, this schema was available under a different name.
For more information, see Google Analytics Connections.
None.
None.
Ticket | Description |
---|---|
TD-69279 | Test Connection button fails a |
Release 9.0
Snowflake Running Environment:
For flows whose data sources and outputs are hosted in Snowflake, you can now push down execution of transformation jobs to Snowflake for much faster execution. For more information, see Snowflake Running Environment.
Dataset Schema Refresh:
You can now refresh your imported datasets with the current schema information from the source file or table. Schema refresh enables you to capture any changes to the columns in your dataset.
Connectivity:
Build connections to accessible REST API endpoints.
This feature is disabled by default. For more information about enabling REST API connectivity in your environment, please contact support.
For more information, see REST API Connections.
Connectivity:
Early Preview (read-only) connections available with this release:
New configuration task for S3 connections
When you create a new workspace, the process by which you connect your workspace to S3 has been improved.
Tip: Future releases will include additional improvements in connecting to AWS and S3 resources.
For more information, see Changes to Configuration.
None.
Ticket | Description |
---|---|
TD-68162 | Flow parameters cannot be displayed or edited in the Transformer page and cannot be embedded in recipe steps. |
None.
Release 8.11
Connectivity:
Early Preview (read-only) connections available with this release:
Performance:
Improvements in job execution performance, due to skipping some output validation steps for file-based outputs.
NOTE: When scheduled or API jobs are executed, no validations are performed of any writesettings objects. Issues with these objects may cause failures during transformation or publishing stages of job execution.
Prior to this release, the size of a sample was capped at 10 MB.
Beginning in this release, the actual size of the stored sample has increased to 40 MB.
NOTE: On backend storage, sample sizes are now four times larger than in previous releases. For datasources that require decompression or conversion, actual storage sizes may exceed this 40 MB limit.
For more information, see Change Recipe Sample Size.
Beginning in this release, mismatched values are written as regular values by default in CSV outputs. In prior releases, mismatched values were written as null values in CSV outputs.
See Improvements to the Type System.
None.
None.
Ticket | Description |
---|---|
TD-68162 | Flow parameters cannot be displayed or edited in the Transformer page and cannot be embedded in recipe steps. |
Release 8.10
Connectivity:
Enable connectivity between the application and your cloud databases using SSH tunneling.
NOTE: SSH tunneling is enabled on a per-connection basis. For this release, SSH tunneling can be enabled on the following connection types:
For more information, see Configure SSH Tunnel Connectivity.
Early Preview (read-only) connections available with this release:
Session Management:
You can view the current and recent sessions for your account in the application. As needed, you can revoke any unfamiliar devices or sessions. For more information, see Sessions Page.
Download vCPU usage:
You can download a detailed report of your vCPU usage in CSV format.
For more information, see Workspace Usage Page.
Ingestion:
Maximum permitted record length has been increased from 1 MB to 20 MB. For more information, see Working with JSON v2.
Publishing:
Improvements to publishing to Snowflake. For more information, see Improvements to the Type System.
Split transformation:
When splitting a column based on positions, the positions no longer need to be listed in numeric order. See Changes to the Language.
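In other words, the supplied positions are normalized before the split is applied. A minimal Python illustration, with a hypothetical helper and made-up sample value:

```python
def split_at_positions(value, positions):
    """Split a string at the given character positions, in any order."""
    bounds = [0] + sorted(positions) + [len(value)]
    return [value[a:b] for a, b in zip(bounds, bounds[1:])]

# Positions [7, 4] now work the same as [4, 7]:
print(split_at_positions("2021-08-10", [7, 4]))  # ['2021', '-08', '-10']
```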
None.
None.
Ticket | Description |
---|---|
TD-66185 | Flatten transformation cannot handle multi-character delimiters. |
Release 8.9
Plans:
Create plan tasks to delete files and folders from file-based backend storage.
For more information, see Create Delete Task.
Collaboration:
Connectivity:
Early Preview (read-only) connections available with this release:
Sampling:
None.
None.
Ticket | Description |
---|---|
TD-65502 | Datasets from parameters are improperly permitted to be referenced in recipes, which returns an error during job execution. |
None.
Release 8.8
Subscribe and upgrade plans via credit card:
Through the Admin console, you can now view and manage your plan, including changing the plan, billing information, and the number of licensed users. You can also upgrade, downgrade, or cancel your subscription. For more information, see Plans and Billing Page.
vCPU consumption:
Administrators can now view the total workspace consumption in vCPU hours. For more information, see Workspace Usage Page.
Flow View:
Import:
Improvements have been made in how double quotes are handled in CSV files during import to align with other systems that support CSV import.
Example values in source CSV file:
"""My product""",In stock,"16,000",0.05 |
Note that the value 16,000
must be double-quoted, since the value contains a comma, which is the field delimiter.
Previously, this value appeared in the Transformer page in columns as the following:
c1 | c2 | c3 | c4 |
---|---|---|---|
"""My product""" | In stock | "16,000" | 0.05 |
As of this version, the application handles these values in a better manner when displaying them in the Transformer page:
c1 | c2 | c3 | c4 |
---|---|---|---|
"My product" | In stock | 16,000 | 0.05 |
c1: Escaped values (triple double-quotes) in the source no longer render in the application as triple double-quotes; they are represented as quoted values.
c3: Note that the double quotes in c3 have been stripped. Leading and trailing quotes are trimmed if the quotes are balanced within a cell.
NOTE: This change in behavior applies only to newly created imported datasets sourced from a CSV file. Existing imported datasets should not be affected. However, if a newly imported dataset is transformed by a previously existing recipe that compensated for the extra quotes in the Transformer page, the effects on output data could be unpredictable. These recipes and their steps should be reviewed. This change does apply to any newly imported dataset sourced from CSV and may cause the data to change. For example, if you export an older flow and import into a new workspace or project, this change in parsing behavior applies to the datasets that are newly created in the new environment. Recipes may require review upon import.
When results are generated in CSV, output files should continue to reflect the formatting of the source data before import. See above.
Tip: You can also choose the Include quotes option when creating a CSV output.
When profiling is enabled, values that appear in CSV as "" are now marked as missing.
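This behavior matches the standard CSV quoting convention, in which a doubled quote inside a quoted field is an escaped literal quote and balanced outer quotes are stripped. For instance, Python's csv module parses the sample line above the same way:

```python
import csv
import io

# The sample source line from the example above.
line = '"""My product""",In stock,"16,000",0.05'
row = next(csv.reader(io.StringIO(line)))
print(row)  # ['"My product"', 'In stock', '16,000', '0.05']
```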
API:
To prevent overloading mission-critical API endpoints, rate limiting has been implemented on a select set of API endpoints in the application. For more information, see Changes to the APIs.
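For API clients, rate limiting typically surfaces as HTTP 429 responses. A minimal client-side sketch, assuming the requests library; the endpoint, token, and retry parameters are placeholders, so consult the API documentation for actual limits:

```python
import time
import requests

def get_with_backoff(url, headers, max_retries=5):
    """GET a rate-limited endpoint, backing off on HTTP 429."""
    for attempt in range(max_retries):
        resp = requests.get(url, headers=headers)
        if resp.status_code != 429:
            return resp
        # Honor the Retry-After header if present, else back off exponentially.
        time.sleep(float(resp.headers.get("Retry-After", 2 ** attempt)))
    resp.raise_for_status()
    return resp

# Placeholder usage:
# resp = get_with_backoff("https://example.com/v4/flows",
#                         headers={"Authorization": "Bearer <token>"})
```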
None.
None.
Ticket | Description |
---|---|
TD-63974 | In imported datasets sourced from CSV files, double quotes that are escaped with a backslash (\) |
Release 8.7 - push 2
Templates:
From the Flows page, you can now access pre-configured templates directly from the templates gallery.
Tip: Click Templates in the Flows page. Select the template, and the template is opened in Flow View for you.
None.
None.
None.
None.
Release 8.7
Browsers:
Plans:
Create plan tasks to deliver messages to a specified Slack channel.
For more information, see Create Slack Task.
Connectivity:
Recipe panel:
None.
API:
Ticket | Description |
---|---|
TD-63564 | Schedules created by a flow collaborator with editor access stop working if the collaborator is removed from the flow. Collaborators with viewer access cannot create schedules. |
Ticket | Description |
---|---|
TD-63517 | Unpivoting a String column preserves null values in Spark but converts them to empty strings in Photon. Running jobs on the different running environments generates different results. |
Release 8.6
NOTE: Trifacta Standard product edition is no longer available. All customers of this product edition have been moved to other product editions.
Rename Workspace
Flow editors and plan collaborators can be permitted to schedule jobs. See Workspace Settings Page.
Connectivity:
Early Preview (read-only) connections available with this release:
Performance:
Conversion jobs are now processed asynchronously.
Better management of file locking and concurrency during job execution.
Better Handling of JSON files
The application now supports regularly formatted JSON files during import. You can now import flat JSON records contained in a single array object. Each record in the array is treated as a single line and imported as a new row. For more information, see Working with JSON v2.
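For example, a file whose top level is a single array of flat records imports as one row per record (the sample data below is made up):

```python
import json

# Made-up sample: flat records in a single top-level array.
raw = '[{"sku": "A1", "qty": 3}, {"sku": "B2", "qty": 7}]'
for record in json.loads(raw):
    print(record)  # each array element becomes one row
# {'sku': 'A1', 'qty': 3}
# {'sku': 'B2', 'qty': 7}
```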
None.
None.
None.
Ticket | Description |
---|---|
TD-63564 | Schedules created by a flow collaborator with editor access stop working if the collaborator is removed from the flow. Collaborators with viewer access cannot create schedules. |
Release 8.5
Parameterization:
Create environment parameters to ensure that all users of the project or workspace use consistent references.
NOTE: You must be a workspace administrator or project owner to create environment parameters.
Tip: Environment parameters can be exported from one project or workspace and imported into another, so that these references are consistent across the enterprise.
Flow View:
Job execution:
This feature may need to be enabled in your environment. For more information, see Workspace Settings Page.
For more information, see Create Output SQL Scripts.
Connectivity:
Contribute to the future direction of connectivity: click I'm interested on a connection card to upvote adding the connection type to the application.
Early Preview (read-only) connections available with this release:
Connectivity:
Connectivity:
Developer:
Download and install the Python SDK, which enables you to leverage the visual tools of the application to transform data in your existing Python pipelines.
NOTE: This is an Alpha release. Do not deploy the Python SDK in production environments.
For more information, see Python SDK.
Job execution:
You can choose to ignore the recipe errors before job execution and then review any errors in the recipe through the Job Details page.
Resource usage:
Language:
None.
None.
None.
Release 8.4 push 2
Template Gallery:
Connectivity:
Early Preview (read-only) connections available with this release:
None.
None.
None.
Release 8.4
Self-serve upgrades from your free trial: Through the Admin console, you can review and select the paid plan that works for you. Submit payment details through the application and begin using your upgraded workspace immediately.
Connectivity:
Early Preview (read-only) connections available with this release:
Connectivity:
Collaboration:
This feature may need to be enabled in your environment. For more information, see Workspace Settings Page.
Language:
None.
Ticket | Description |
---|---|
TD-60881 | Incorrect file path and missing file extension in the application for parameterized outputs |
TD-60382 | Date format |
None.
Release 8.3 - push 2
None.
The application now uses the conversion service to ingest JSON files during import. This improved method of ingestion can save significant time wrangling JSON into records.
NOTE: The new method of JSON import is now enabled by default. It can be disabled as needed. Additional information is below.
For more information, see Working with JSON v2.
None.
None.
Release 8.3
Single login for multiple workspaces: Log in through https://cloud.trifacta.com and then choose the workspace to which to connect. If you are a member of multiple workspaces, you can switch between workspaces through the User menu. For more information, see Home Page.
Connectivity:
Early Preview (read-only) connections available with this release:
Connectivity:
Job execution:
Introducing new filter pushdowns to optimize the performance of your flows during job execution. For more information, see Flow Optimization Settings Dialog.
Job results:
You can now preview job results and download them from the Overview tab of the Job details page. For more information, see Job Details Page.
Tip: You can also preview job results in Flow View. See View for Outputs.
Beginning in this release, the application uses the conversion service to ingest JSON files during import. This improved method of ingestion can save significant time wrangling JSON into records.
NOTE: The new method of JSON import is disabled by default but can be enabled as needed.
For more information, see Working with JSON v2.
Flows that use imported datasets created using the old method continue to work without modification.
NOTE: Support for the v1 version of JSON import is likely to be deprecated in a future release. You should switch to the new version as soon as possible. For more information on migrating your flows and datasets to use the new version, see Working with JSON v1.
Future work on support for JSON is targeted for the v2 version only.
For more information on using the old version and migrating to the new version, see Working with JSON v1.
The following API endpoints are scheduled for deprecation in a future release:
NOTE: Please avoid using the following endpoints.
/v4/connections/vendors
/v4/connections/credentialTypes
/v4/connections/:id/publish/info
/v4/connections/:id/import/info
These endpoints have little value for public use.
Ticket | Description |
---|---|
TD-60701 | Most non-ASCII characters incorrectly represented in visual profile downloaded in PDF format |
TD-59854 | Datetime column from Parquet file incorrectly inferred to the wrong data type on import. |
None.
Release 8.2 - push 2
This is the initial release of the following product editions:
Upgrade: Trial customers can upgrade through the Admin console. See Admin Console.
None.
None.
None.
Release 8.2
Preferences:
API:
Connectivity:
Early Preview (read-only) connections available with this release:
Plan metadata references:
Use metadata values from other tasks and from the plan itself in your HTTP task definitions.
Spark Execution Properties Settings:
Advanced Spark parameters have been newly added to the Spark running environment. Use these parameters to override the global Spark settings for individual jobs. For more information, see Spark Execution Properties Settings.
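As a generic illustration of what overriding Spark properties means (shown with standard PySpark configuration; in the product these values are entered per job in the Spark Execution Properties Settings rather than in code, and the values below are examples only):

```python
from pyspark.sql import SparkSession

# Example values only; property names are standard Spark settings.
spark = (
    SparkSession.builder
    .appName("per-job-overrides")
    .config("spark.executor.memory", "4g")         # memory per executor
    .config("spark.executor.cores", "2")           # cores per executor
    .config("spark.sql.shuffle.partitions", "64")  # shuffle parallelism
    .getOrCreate()
)
```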
Improve Accessibility of Job Results:
The Jobs tab has been enhanced to display the latest and previous jobs that have been executed for the selected output.
For more information, see View for Outputs.
Sample Jobs Page:
None.
Ticket | Description |
---|---|
TD-59236 | Use of percent sign (%) in file names causes Transformer page to crash during preview. |
TD-59218 | BOM characters at the beginning of a file cause multiple headers to appear in the Transformer page. |
TD-58932 | Cannot read file paths with colons from EMR Spark jobs |
TD-58802 | Running job on Photon using zipped CSV that was uploaded yields garbage characters in output. |
TD-58694 | Very large number of files generated during Spark job execution |
Ticket | Description |
---|---|
TD-60701 | Most non-ASCII characters incorrectly represented in visual profile downloaded in PDF format. |
Release 8.1
Storage environment:
Support for using your base storage environment and S3 as storage environments at the same time. For more information, see Configure Storage Environment.
Connectivity:
Introducing Early Preview connections. In each release of cloud-based product editions, new connection types may be made available in read-only mode for users to begin exploring their datasets stored in the connected datastores.
NOTE: Early Preview connection types are read-only and are subject to change before they may be made generally available.
API:
Customize connection types (connectors) to ensure consistency across all connections of the same type and to meet your enterprise requirements. For more information, see Changes to the APIs.
Job results:
Results of data quality checks are now part of the visual profile PDF available with your job results. In the PDF, you can review the data quality results over the entire dataset.
Specify column headers during import
You can specify the column headers for your dataset during import. For more information, see Import Data Page.
None.
Ticket | Description |
---|---|
TD-56170 | The Test Connection button for some relational connection types does not perform a test authentication of user credentials. |
TD-54440 | Header sizes at intermediate nodes for JDBC queries cannot be larger than 16K. Previously, the column names for JDBC data sources were passed as part of a header in a GET request. For very wide datasets, these GET requests often exceeded 16K in size, which represented a security risk. |
None.
Release 8.1, push 2
Macro updates:
You can replace an existing macro definition with a macro that you have exported to your local desktop.
NOTE: Before you replace the existing macro, you must export a macro to your local desktop. For more information, see Export Macro.
For more information, see Macros Page.
Early Preview connections available with this release:
None.
None.
None.
Release 8.0
Connectivity:
Support for using OAuth2 authentication for Salesforce connections. See Salesforce Connections.
Import:
Recipe development:
Metric-based data quality rules:
Update Macros:
Job execution:
APIs:
None.
Ticket | Description |
---|---|
TD-57180 | AWS jobs run on Photon to publish to HYPER format fail during file conversion or writing. |
Ticket | Description |
---|---|
TD-56170 | The Test Connection button for some relational connection types does not perform a test authentication of user credentials. |