Release 8.2.2
March 25, 2022
What's New
Databricks:
- Support for Databricks 7.x and 8.x.
Info NOTE: Databricks 7.3 and Databricks 8.3 are recommended.
- Support for Databricks cluster creation via cluster policies.
- Store a user-defined set of secret information, such as credentials, in Databricks Secrets (see the sketch after this list).
- For more information, see Configure for Azure Databricks.
- For more information, see Configure for AWS Databricks.
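The snippet below is a minimal, hypothetical sketch of the underlying Databricks Secrets mechanism, not of this product's own configuration; the scope and key names are invented for illustration:

```python
# Assumes a secret scope and key created beforehand with the Databricks CLI
# (names here are invented):
#   databricks secrets create-scope --scope demo-scope
#   databricks secrets put --scope demo-scope --key db-password
# Inside a Databricks notebook or job, dbutils is available without an import.
try:
    password = dbutils.secrets.get(scope="demo-scope", key="db-password")  # noqa: F821
except NameError:
    password = None  # not running inside a Databricks cluster
# Databricks redacts the value if you attempt to display it in a notebook.
```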
Changes in System Behavior
Publishing:
- Improvements to publishing of ...
Nginx:
- Upgraded to Nginx 1.20.1.
Deprecated
None.
Key Bug Fixes
Ticket | Description
---|---|
TD-69201 | Vulnerability scan detected compromised versions of log4j in the Trifacta Hadoop dependency JARs. |
TD-69052 | Spark job fails when using parameterized files as input. |
TD-69004 | Patch httpd to version 2.4.52. |
TD-68085 | ... |
TD-67953 | Remove log4j dependencies from Java projects. |
TD-67747 | CVE-2021-44832: Apache Log4j2 vulnerable to RCE via JDBC Appender when an attacker controls the configuration. |
TD-67677 | EMR Spark job fails with error "org.apache.spark.sql.AnalysisException: Cannot resolve column name" if flow optimizations are enabled. |
TD-67640 | Intermittent failure to publish to Tableau in Fileconverter. |
TD-67572 | EMR Spark job fails with error "org.apache.spark.sql.AnalysisException: Cannot resolve column name". |
TD-67558 | CVE-2021-45105: Log4j vulnerability (denial of service). |
TD-67531 | Glue jobs not working after upgrade to Release 8.2. |
TD-67455 | CVE-2021-45046: Log4j vulnerability. |
TD-67410 | CVE-2021-23017: Nginx v1.20.0 security vulnerability. |
TD-67388 | Nest function failing. |
TD-67372 | Patch/update Log4j (RCE 0-day exploit found in log4j). |
TD-67329 | Publish failing with "java.io.IOException: No FileSystem for scheme: sftp". |
TD-66779 | Output home directory is not picked up correctly for job runs on WASB/ADLS Gen2. |
TD-66160 | SSLHandshakeException when accessing a Databricks table. |
TD-66025 | Glue connection not working after upgrade to Release 8.2. |
TD-65696 | In Azure environments, changing the user output/upload directory persists only the path, not the container name or storage account. |
TD-65331 | Writing to ADLS fails during the SSL handshake over TLSv1.1. |
TD-65286 | ... |
TD-65058 | Unable to upgrade due to migration failure. |
TD-64627 | ... |
TD-64528 | Upgrade to Release 8.2 failed to load dictionaries. |
TD-64281 | /change-password page fails to load. |
TD-64171 | Cannot import parameterized datasets that include files with zero and non-zero byte sizes together. |
TD-63981 | Start/stop scripts should not modify any config/database settings during startup. |
TD-63867 | Jobs are not triggered for parameterized datasets with zero-byte file sizes. |
TD-63493 | Unable to cancel a plan run. |
TD-60881 | Incorrect path shown when using a parameterized output path. |
TD-59706 | No vertical scroll when there are too many connections on the Import page. |
TD-58576 | Cannot read property 'expandScriptLines' of undefined when a flow node's activeSampleId points to a failed (null) sample. |
New Known Issues
None.
Release 8.2.1
August 13, 2021
...
EMR:
- Updated supported versions of EMR to address log4j2 issues. Please upgrade to EMR 5.30.2.
- For more information, see https://docs.trifacta.com/display/PUB/Trifacta+Alert+TD-67372+-+0-Day+Exploit+in+log4j2+for+Self+Managed.
Databricks:
- Support for Databricks 8.3.
- For more information, see Configure for Azure Databricks.
- For more information, see Configure for AWS Databricks.
- Support for per-user access to ADLS Gen2 for running Databricks jobs. For more information, see Enable ADLS Gen2 Access.
Trifacta node:
- NodeJS upgraded to 14.16.0.
Changes in System Behavior
None.
Key Bug Fixes
Ticket | Description
---|---|
TD-62689 | Nginx returns Bad Request Status: 400 error, due to duplicate entries in ... |
New Known Issues
None.
Release 8.2
June 11, 2021
What's New
Preferences:
- Re-organized user account, preferences, and storage settings to streamline the setup process. See Preferences Page.
API:
Connectivity:
- Support for custom SQL queries on Databricks Tables (see the sketch after this list). See Create Databricks Tables Connections.
...
- For more information, see Create HTTP Task.
- For more information, see Plan Metadata References.
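Returning to the custom SQL support noted above: the query below is a hypothetical example of the kind of statement you might supply instead of selecting an entire table; the table and column names are invented, and the supported syntax is described in Create Databricks Tables Connections.

```python
# Hypothetical custom SQL for a Databricks Tables connection: pre-aggregate
# and pre-filter in the database rather than importing the whole table.
custom_sql = """
SELECT customer_id, SUM(amount) AS total_spend
FROM sales.transactions
WHERE txn_date >= '2021-01-01'
GROUP BY customer_id
"""
print(custom_sql)
```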
Improved accessibility of job results:
The Jobs tabs have been enhanced to display the latest and previous jobs executed for the selected output.
...
Users and groups:
The AD users and groups integration is now generally available. See Configure Users and Groups.
Install:
Support for Nginx 1.20.0 on the Trifacta node.
Changes in System Behavior
Java service classpath changes:
...
Ticket | Description |
---|---|
TD-59854 | Datetime column from a Parquet file is inferred as the wrong data type on import. |
TD-59658 | IAM roles passed through SAML are not updated after a hotfix upgrade. |
TD-59633 | With the session tag feature enabled, requests fail with a "The security token included in the request is invalid" error. |
TD-59331 | When the include quotes option is disabled on an output, Databricks still places quotes around empty values. |
TD-59128 | BOM characters at the beginning of a file cause multiple headers to appear in the Transformer page. |
TD-58932 | Cannot read file paths with colons from EMR Spark jobs. |
TD-58694 | Very large number of files generated during Spark job execution. |
TD-58523 | Cannot import dataset with filename in Korean alphabet from HDFS. |
New Known Issues
Ticket | Description |
---|---|
TD-60701 | Most non-ASCII characters incorrectly represented in visual profile downloaded in PDF format. |
Release 8.1
February 26, 2021
What's New
Tip |
---|
In-app messaging: Be sure to check out the new in-app messaging feature, which allows us to share new features and relevant content to you. |
Install:
- Support for PostgreSQL 12.X for the Trifacta databases on all supported operating systems.
Info NOTE: Beginning in this release, the latest stable release of PostgreSQL 12 can be installed with the Trifacta platform. Earlier versions of PostgreSQL 12.X can be installed manually.
Info NOTE: Support for PostgreSQL 9.6 is deprecated for customer-managed Hadoop-based deployments and AWS deployments. PostgreSQL 9.6 is supported only for Azure deployments. When Azure supports PostgreSQL 12 or later, support for PostgreSQL 9.6 will be deprecated in the subsequent release of Trifacta.
- For more information, see Install Databases for PostgreSQL.
- See Product Support Matrix.
...
Define permissions on individual objects when they are shared.
Info NOTE: Fine-grained sharing permissions apply to flows and connections only.
For more information, see Changes to User Management.
API:
- Apply job-level overrides to AWS Databricks or Azure Databricks job executions via API (see the sketch after this list). See API Workflow - Run Job.
...
- Customize connection types (connectors) to ensure consistency across all connections of the same type and to meet your enterprise requirements. For more information, see Changes to the APIs.
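The following is a hedged sketch of what such an override request can look like. The base URL, token, object ID, and override field values are assumptions for illustration only; the authoritative request shape is documented in API Workflow - Run Job.

```python
import requests

# Hypothetical example: launch a job for an output object and override the
# running environment and profiling behavior for this run only.
response = requests.post(
    "https://wrangler.example.com/v4/jobGroups",   # base URL is a placeholder
    headers={"Authorization": "Bearer <access-token>"},
    json={
        "wrangledDataset": {"id": 28},             # output object to execute
        "overrides": {
            "execution": "databricksSpark",        # assumed value for a Databricks run
            "profiler": False,                     # skip profiling for this run
        },
    },
)
response.raise_for_status()
print(response.json())                             # response describes the new job group
```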
Running environment:
- Support for Databricks as a running environment for Trifacta hosted in AWS. For more information, see Configure for AWS Databricks.
- Integration with AWS Secrets Manager is required. For more information, see Configure for AWS Secrets Manager.
- Support for connections to Databricks Tables from AWS Databricks. See Create Databricks Tables Connections.
- Support for job throttling for user clusters.
- Support for custom driver-specific instance pools on Databricks.
Publishing:
- Support for publishing Datetime values in Parquet outputs. For more information, see Improvements to the Type System.
Macro updates:
You can replace an existing macro definition with a macro that you have exported to your local desktop.
Info |
---|
NOTE: Before you replace the existing macro, you must export a macro to your local desktop. For more information, see Export Macro. |
...
Specify column headers during import:
You can specify the column headers for your dataset during import. For more information, see Import Data Page.
...
- The Secure Token Service is used for managing tokens for third-party systems, such as Azure Key Vault and OAuth 2.0 authentication. See Configure Secure Token Service.
- The Connector Configuration Service manages the storage and retrieval of connection type information for the workspace. See Configure Connector Configuration Service.
Changes in System Behavior
Info |
---|
NOTE: CDH 6.1 is no longer supported. Please upgrade to the latest supported version. For more information, see Product Support Matrix. |
Info |
---|
NOTE: HDP 2.6 is no longer supported. Please upgrade to the latest supported version. For more information, see Product Support Matrix. |
Support for custom data types based on dictionary files to be deprecated:
Info |
---|
NOTE: The ability to upload dictionary files and use their contents to define custom data types is scheduled for deprecation in a future release. This feature is limited and inflexible. Until an improved feature can be released, please consider using workarounds. For more information, see Validate Your Data. You can create custom data types using regular expressions. For more information, see Create Custom Data Types. |
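As an illustration of the regular-expression approach, the custom type and pattern below are invented, not shipped with the product; the actual registration steps are described in Create Custom Data Types.

```python
import re

# Hypothetical "US phone number" type expressed as a regular expression.
US_PHONE = re.compile(r"^\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}$")

for value in ["(415) 555-0137", "415.555.0137", "not-a-phone"]:
    status = "valid" if US_PHONE.match(value) else "mismatched"
    print(f"{value}: {status}")
```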
...
Installation of database client is now required:
Before you install or upgrade the database or perform any required database cross-migrations, you must install the appropriate database client first.
Info |
---|
NOTE: Use of the database client provided with each supported database distribution is now a required part of any installation or upgrade of the Trifacta platform. |
For more information:
Job logs collected asynchronously for Databricks jobs:
...
Integrations between Trifacta and ...
Key Bug Fixes
Ticket | Description
---|---|
TD-56170 | The Test Connection button for some relational connection types does not perform a test authentication of user credentials. |
TD-54440 | Header sizes at intermediate nodes for JDBC queries cannot be larger than 16K. Previously, the column names for JDBC data sources were passed as part of a header in a GET request. For very wide datasets, these GET requests often exceeded 16K in size, which represented a security risk. The solution is to turn these GET requests into ingestion jobs. |
New Known Issues
Ticket | Description
---|---|
TD-58818 | Cannot run jobs on some builds of HDP 2.6.5 and later. There is a known incompatibility between HDP 2.6.5.307-2 and later and the Hadoop bundle JARs that are shipped with the Trifacta platform. |
TD-58523 | Cannot import dataset with filename in Korean alphabet from HDFS. |
TD-55299 | Imported datasets with encodings other than UTF-8 and line delimiters other than ... |
TD-51516 | Input data containing BOM (byte order mark) characters may cause Spark or ... |
Release 8.0
January 26, 2021
What's New
APIs:
- Individual workspace users can be permitted to create and use their own access tokens for use with the REST APIs (see the sketch below). For more information, see Workspace Settings Page.
...
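Below is a minimal, hypothetical sketch of authenticating an API call with such a token; the base URL and endpoint are placeholders, and token creation is described in Workspace Settings Page.

```python
import requests

TOKEN = "<personal-access-token>"  # placeholder; never hard-code real tokens

# Hypothetical call: list imported datasets, authenticating with the token.
response = requests.get(
    "https://wrangler.example.com/v4/importedDatasets",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
response.raise_for_status()
for dataset in response.json().get("data", []):
    print(dataset.get("id"), dataset.get("name"))
```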
- Support for connections to SharePoint Lists. See Create SharePoint Connections.
- Support for using OAuth2 authentication for Salesforce connections.
Info NOTE: Use of OAuth2 authentication requires additional configuration. For more information, see OAuth 2.0 for Salesforce.
- Support for re-authenticating through connections that were first authenticated using OAuth2.
Import:
- Improved method for conversion and ingestion of XLS/XLSX files. For more information, see Import Excel Data.
...
- The Flag for Review feature enables you to set review checkpoints in your recipes. You can flag recipe steps for other collaborators to review and approve. For more information, see Flag for Review.
Update Macros:
- Replace / overwrite an existing macro's steps and inputs with a newly created macro.
- Map new macro parameters to the existing parameters before replacing.
- Edit macro input names and default values as needed.
- For more information, see Create or Replace Macro.
- For more information, see Overview of Macros.
Job execution:
- You can enable the Trifacta application to apply SQL filter pushdowns to your relational datasources, removing unused rows before their data is imported for a job execution. This optimization can significantly improve performance, as less data is transferred during the job run (see the sketch after this list). For more information, see Flow Optimization Settings Dialog.
- Optimizations that were applied during the job run now appear in the Job Details page. See Job Details Page.
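To make the optimization concrete, the illustrative (not product-generated) queries below show the effect of pushing a filter down to the source database; the table and filter are invented.

```python
# Without pushdown: the whole table is read, then filtered after ingestion.
query_without_pushdown = "SELECT * FROM sales.orders"

# With pushdown: the filter travels into the SQL issued to the database,
# so unused rows never leave the source.
query_with_pushdown = (
    "SELECT * FROM sales.orders WHERE order_date >= DATE '2021-01-01'"
)

print(query_without_pushdown)
print(query_with_pushdown)
```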
Changes in System Behavior
None.
Key Bug Fixes
Ticket | Description |
---|---|
TD-57354 | Cannot import data from Azure Databricks. This issue is caused by an incompatibility with TLS v1.3, which was backported to Java 8. |
TD-57180 | AWS jobs run on Photon to publish to HYPER format fail during file conversion or writing. |
New Known Issues
Ticket | Description
---|---|
TD-56170 | The Test Connection button for some relational connection types does not perform a test authentication of user credentials. |
Release 7.10
December 21, 2020
What's New
Tip |
---|
Tip: Check out the new in-app tours, which walk you through the steps of wrangling your datasets into clean, actionable data. |
...
- Improved Salesforce connection type. For more information, see Create Salesforce Connections.
Language:
- New function for calculating end-of-month values. See Changes to the Language.
...
- NOTE: This feature can be changed or removed from the platform at any time without notice. Do not deploy it in a production environment.
- For more information, see API Workflow - Wrangle Output to Python.
Changes in System Behavior
Rebuild custom UDF JARs for Databricks clusters
...
For more information, see Improvements to the Type System.
Key Bug Fixes
Ticket | Description |
---|---|
TD-54742 | Access to S3 is disabled after upgrade. |
TD-53527 | When importing a dataset via API that is sourced from a BZIP file stored on S3, the columns may not be properly split when the platform is permitted to detect the structure. |
New Known Issues
Ticket | Description
---|---|
TD-57180 | AWS jobs run on Photon to publish to HYPER format fail during file conversion or writing. |
TD-56830 | Receive ... |
Release 7.9
November 16, 2020
What's New
Plan View:
- Execute Plan using status rules: Starting in Release 7.9, you can execute tasks based on the previous task execution result. For more information, see Create a Plan.
- Execute Parallel Plan tasks: In previous releases, plans were limited to sequential task execution. Beginning in Release 7.9, you can branch the plan graph into separate parallel nodes, enabling the corresponding tasks to run in parallel. This feature gives you greater control over your plans' workflows. For more information, see Create a Plan.
- Zoom options: Zoom control options and keyboard shortcuts have been introduced in the plan canvas. For more information, see Plan View Page.
- Filter Plan Runs: Filter your plan runs based on dates or plan types. For more information, see Plan Runs Page.
...
- An All option has been added for selecting columns in the Transform Builder. For more information, see Changes to the Language page.
Changes in System Behavior
Manage Users section has been deprecated:
...
- The Dependencies tab has been renamed the Dependency Graph tab.
- The old flow view in the dependency graph tab is replaced with the new flow view. For more information, see Job Details Page.
Key Bug Fixes
Ticket | Description |
---|---|
TD-55125 | Cannot copy flow. However, export and import of the flow enables copying. |
TD-53475 | Missing associated artifact error when importing a flow. |
New Known Issues
None.
Release 7.8
October 19, 2020
What's New
Plans:
Create HTTP tasks for your plans, which can be configured to issue a request to an API endpoint over HTTP (see the sketch after this list).
- For more information, see Plan View for HTTP Tasks.
- For more information on plans, see Overview of Operationalization.
- The viewport position and zoom level are now preserved when returning to a given flow.
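The sketch below approximates the kind of request an HTTP task issues when a plan reaches it; the URL, headers, and payload are placeholders, and the actual configurable fields are described in Create HTTP Task.

```python
import requests

# Hypothetical webhook-style notification, similar in spirit to an HTTP task:
# POST a small JSON body to an endpoint when a plan stage completes.
response = requests.post(
    "https://hooks.example.com/plan-events",
    headers={"Content-Type": "application/json"},
    json={"plan": "nightly-refresh", "status": "succeeded"},
    timeout=30,
)
print(response.status_code)
```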
Publishing:
- Improved performance when publishing to Tableau Server.
- Configure publishing chunk sizes as needed. For more information, see Configure Data Service.
Language:
- The rename columns transformation now supports converting column names to uppercase or lowercase, and shortening column names to a specified character length from the left or right. For more information, see Changes to the Language.
Connectivity:
IAM support for Redshift connections.
Info NOTE: To enable use of an existing IAM role for Redshift, additional permissions must be added. For more information, see Required AWS Account Permissions.
For more information, see Create Redshift Connections.
Changes in System Behavior
JDBC connection pooling disabled:
...
Enhanced Flow and Flow View menu options:
The context menu options for Flow View and Flow have been renamed and reorganized for a better user experience.
- For more information, see Flows Page.
- For more information, see Flow View Page.
Key Bug Fixes
None.
New Known Issues
Ticket | Description
---|---|
TD-54030 | When creating custom datasets from Snowflake, columns containing time zone data are rendered as null values in visual profiles, and publishing back to Snowflake fails. |
Release 7.7
September 21, 2020
What's New
Flow View:
- Automatically organize the nodes of your flow with a single click. See Flow View Page.
Changes in System Behavior
Parameter History panel feature deprecated:
...
For more information, see Flow View Page.
Key Bug Fixes
Ticket | Description |
---|---|
TD-53318 | Cannot publish results to relational targets when flow name or output filename or table name contains a hyphen (e.g. my - filename.csv). |
New Known Issues
None.