
Release 8.2.2

March 25, 2022

What's New

Databricks:

  • Support for Databricks 7.x and 8.x.

    Info

    NOTE: Databricks 7.3 and Databricks 8.3 are recommended.

  • Support for Databricks cluster creation via cluster policies.
  • Store a user-defined set of secret information such as credentials in Databricks Secrets.

Changes in System Behavior

Publishing:

Improvements to publishing of Date values to Snowflake. For more information, see Improvements to the Type System.

Nginx:

  • Upgraded to Nginx 1.20.1.

Deprecated

None.

Key Bug Fixes

Ticket  Description
TD-69201  Vulnerability scan detected compromised versions of log4j on the Trifacta Hadoop dependency JARs
TD-69052  Job fails using Spark when using parameterized files as input
TD-69004  Patch httpd to version 2.4.52
TD-68085  Trifacta unavailable due to update lock on plantasksnapshotruns
TD-67953  Remove log4j dependencies from Java projects
TD-67747  CVE-2021-44832: Apache Log4j2 vulnerable to RCE via JDBC Appender when attacker controls configuration
TD-67677  EMR Spark job fails with error "org.apache.spark.sql.AnalysisException: Cannot resolve column name" if flow optimizations are enabled.
TD-67640  Intermittent failure to publish to Tableau in Fileconverter.
TD-67572  EMR Spark job fails with error "org.apache.spark.sql.AnalysisException: Cannot resolve column name"
TD-67558  CVE-2021-45105: Log4j vulnerability (denial of service)
TD-67531  Glue jobs not working after upgrade to Release 8.2
TD-67455  CVE-2021-45046: Log4j vulnerability
TD-67410  CVE-2021-23017: Nginx v1.20.0 security vulnerability
TD-67388  Nest function failing
TD-67372  Patch/update Log4j (RCE 0-day exploit found in Log4j)
TD-67329  Publish failing with "java.io.IOException: No FileSystem for scheme: sftp"
TD-66779  Output home directory is not picked up correctly for job runs in WASB/ADLS Gen2
TD-66160  SSLHandshakeException when accessing Databricks table
TD-66025  Glue connection not working after upgrade to Release 8.2
TD-65696  In Azure environments, changing the user output/upload directory persists only the path, not the container name/storage account.
TD-65331  Writing to ADLS failing in SSL handshake with TLSv1.1
TD-65286  Jobs fail at Transform stage with Optimizer Service exception
TD-65058  Unable to upgrade due to migration failure
TD-64627  Trifacta failing due to concurrent DB transaction
TD-64528  Upgrade to Release 8.2 failed to load dictionaries
TD-64281  /change-password page fails to load.
TD-64171  Cannot import parameterized datasets that include files with zero and non-zero byte sizes together.
TD-63981  Start/stop scripts should not modify any config/database settings during startup.
TD-63867  Jobs are not triggering for parameterized datasets with zero-byte file sizes.
TD-63493  Unable to cancel a plan run
TD-60881  Incorrect path shown when using parameterized output path
TD-59706  No vertical scroll when there are too many connections on the Import page
TD-58576  Cannot read property 'expandScriptLines' of undefined when a flow node's activeSampleId points to a failed (null) sample.

New Known Issues

None.

Release 8.2.1

August 13, 2021

...

EMR:

Databricks:

Trifacta node:

  • NodeJS upgraded to 14.16.0.


Changes in System Behavior

None.

Key Bug Fixes

Ticket  Description
TD-62689

Nginx returns Bad Request Status: 400 error, due to duplicate entries in /etc/nginx/conf.d/trifacta.conf for:

Code Block
proxy_set_header Host $host;


Tip

Workaround: Delete the duplicate entry in the file manually.
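Before restarting Nginx, you can confirm the duplication by scanning trifacta.conf for repeated directives. This is a minimal sketch; the helper name and sample fragment are ours, and only the directive itself comes from the fix above:

```python
def count_directive(conf_text, directive="proxy_set_header Host"):
    """Count occurrences of a directive in an Nginx config fragment."""
    stripped = (line.strip() for line in conf_text.splitlines())
    return sum(1 for line in stripped if line.startswith(directive))

# Hypothetical config fragment showing the duplicate entry that triggers the 400 error.
sample = """
server {
    proxy_set_header Host $host;
    proxy_set_header Host $host;
}
"""
print(count_directive(sample))  # prints 2 -> one of the entries must be removed
```

A count greater than 1 indicates the file needs the manual cleanup described above.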


New Known Issues

None.

Release 8.2

June 11, 2021

What's New

Preferences:

  • Re-organized user account, preferences, and storage settings to streamline the setup process. See Preferences Page.

API:

Connectivity:

...


Improved accessibility of job results:


The Jobs tabs have been enhanced to display the latest and previous jobs that have been executed for the selected output.

...

You can monitor the status of all sample jobs that you have generated. Project administrators can access all sample jobs in the workspace. For more information, see Sample Jobs Page.

Users and groups:

The AD users and groups integration is now generally available. See Configure Users and Groups.

Install:

Support for Nginx 1.20.0 on the Trifacta node. See System Requirements.


Changes in System Behavior


Java service classpath changes:

...

Ticket  Description
TD-59854  Datetime column from Parquet file inferred as the wrong data type on import.
TD-59658  IAM roles passed through SAML do not update after hotfix upgrade
TD-59633  Enabled session tag feature but running into "The security token included in the request is invalid" error
TD-59331  When the include quotes option is disabled on an output, Databricks still places quotes around empty values.
TD-59128  BOM characters at the beginning of a file causing multiple headers to appear in the Transformer page.
TD-58932  Cannot read file paths with colons from EMR Spark jobs
TD-58694  Very large number of files generated during Spark job execution
TD-58523  Cannot import dataset with filename in Korean alphabet from HDFS.

New Known Issues

Ticket  Description
TD-60701  Most non-ASCII characters incorrectly represented in visual profile downloaded in PDF format.

Release 8.1

February 26, 2021

What's New


Tip

In-app messaging: Be sure to check out the new in-app messaging feature, which allows us to share new features and relevant content with Trifacta users in your workspace. The user messaging feature can be disabled by workspace administrators if necessary. See Workspace Settings Page.



Install:

  • Support for PostgreSQL 12.X for Trifacta databases on all supported operating systems.

    Info

    NOTE: Beginning in this release, the latest stable release of PostgreSQL 12 can be installed with the Trifacta platform. Earlier versions of PostgreSQL 12.X can be installed manually.

    Info

    NOTE: Support for PostgreSQL 9.6 is deprecated for customer-managed Hadoop-based deployments and AWS deployments. PostgreSQL 9.6 is supported only for Azure deployments. When Azure supports PostgreSQL 12 or later, support for PostgreSQL 9.6 will be deprecated in the subsequent release of Trifacta.

...

  • Define permissions on individual objects when they are shared.

    Info

    NOTE: Fine-grained sharing permissions apply to flows and connections only.

    For more information, see Changes to User Management.

API:

  • Apply job-level overrides to AWS Databricks or Azure Databricks job executions via API. See API Workflow - Run Job.

...

  • Customize connection types (connectors) to ensure consistency across all connections of the same type and to meet your enterprise requirements. For more information, see Changes to the APIs.


Running environment:



Publishing:

Macro updates:


You can replace an existing macro definition with a macro that you have exported to your local desktop.

Info

NOTE: Before you replace the existing macro, you must export a macro to your local desktop. For more information, see Export Macro.

...

You can monitor the status of all sample jobs that you have generated. Project administrators can access all sample jobs in the workspace. For more information, see Sample Jobs Page.

Specify column headers during import


You can specify the column headers for your dataset during import. For more information, see Import Data Page.

...


Changes in System Behavior


Info

NOTE: CDH 6.1 is no longer supported. Please upgrade to the latest supported version. For more information, see Product Support Matrix.


Info

NOTE: HDP 2.6 is no longer supported. Please upgrade to the latest supported version. For more information, see Product Support Matrix.



Support for custom data types based on dictionary files to be deprecated:

Info

NOTE: The ability to upload dictionary files and use their contents to define custom data types is scheduled for deprecation in a future release. This feature is limited and inflexible. Until an improved feature can be released, please consider using workarounds. For more information, see Validate Your Data.

You can create custom data types using regular expressions. For more information, see Create Custom Data Types.

...

Installation of database client is now required:

Before you install or upgrade the database or perform any required database cross-migrations, you must install the appropriate database client first.

Info

NOTE: Use of the database client provided with each supported database distribution is now a required part of any installation or upgrade of the Trifacta platform.

For more information: 

Job logs collected asynchronously for Databricks jobs:

...

Integrations between Trifacta and the Alation and Waterline services are now deprecated. For more information, see End of Life and Deprecated Features.


Key Bug Fixes

Ticket  Description
TD-56170  The Test Connection button for some relational connection types does not perform a test authentication of user credentials.
TD-54440

Header sizes at intermediate nodes for JDBC queries cannot be larger than 16K.

Previously, the column names for JDBC data sources were passed as part of a header in a GET request. For very wide datasets, these GET requests often exceeded 16K in size, which represented a security risk.

The solution is to turn these GET requests into ingestion jobs.

Info

NOTE: To mitigate this issue, JDBC ingestion and JDBC long loading must be enabled in your environment. For more information, see Configure JDBC Ingestion.



New Known Issues

Ticket  Description
TD-58818

Cannot run jobs on some builds of HDP 2.6.5 and later. There is a known incompatibility between HDP 2.6.5.307-2 and later and the Hadoop bundle JARs that are shipped with the Trifacta installer.

Tip

Solution: Use an earlier compatible version. For more information, see Configure for Hortonworks.


TD-58523

Cannot import dataset with filename in Korean alphabet from HDFS.

Tip

Workaround: You can upload files with Korean characters from your desktop. You can also append a 1 to the end of the filename on HDFS, and the file can then be imported.


TD-55299

Imported datasets with encodings other than UTF-8 and line delimiters other than \n may generate empty outputs on the Spark or Dataflow running environments.

TD-51516

Input data containing BOM (byte order mark) characters may cause the Spark or Dataflow running environments to read data improperly and/or generate invalid results.
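Until this issue is addressed, one mitigation is to strip the BOM before the data reaches the platform. The sketch below is illustrative (the helper name and sample bytes are ours), not a platform feature:

```python
import codecs

def strip_bom(data):
    """Remove a leading byte order mark (UTF-8/16/32) from raw bytes, if present."""
    # Check UTF-32 BOMs first: the UTF-32-LE BOM begins with the UTF-16-LE BOM bytes.
    for bom in (codecs.BOM_UTF32_LE, codecs.BOM_UTF32_BE,
                codecs.BOM_UTF8, codecs.BOM_UTF16_LE, codecs.BOM_UTF16_BE):
        if data.startswith(bom):
            return data[len(bom):]
    return data

raw = codecs.BOM_UTF8 + b"col_a,col_b\n1,2\n"
print(strip_bom(raw))  # b'col_a,col_b\n1,2\n'
```

Running input files through a filter like this before import avoids the misread-header behavior described above.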

Release 8.0

January 26, 2021

What's New

APIs:

  • Individual workspace users can be permitted to create and use their own access tokens for use with the REST APIs. For more information, see Workspace Settings Page.

...

Import:


  • Improved method for conversion and ingestion of XLS/XLSX files. For more information, see Import Excel Data.

...

  • The Flag for Review feature enables you to set review checkpoints in your recipes. You can flag recipe steps for review and approval by other collaborators. For more information, see Flag for Review.

Update Macros:

  • Replace / overwrite an existing macro's steps and inputs with a newly created macro.
  • Map new macro parameters to the existing parameters before replacing.
  • Edit macro input names and default values as needed. 


Job execution:

  • You can enable the Trifacta application to apply SQL filter pushdowns to your relational datasources to remove unused rows before their data is imported for a job execution. This optimization can significantly improve performance, as less data is transferred during the job run. For more information, see Flow Optimization Settings Dialog.
  • Optimizations that were applied during the job run now appear in the Job Details Page. See Job Details Page.
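The idea behind a SQL filter pushdown can be sketched as follows: the row filter moves from the application into the source query, so excluded rows never leave the database. This is an illustrative sketch under our own assumptions, not the platform's implementation; the table and column names are hypothetical:

```python
def push_down_filter(base_query, predicate=None):
    """Append a WHERE clause so the source database drops unused rows
    before any data is transferred."""
    if predicate:
        return "{} WHERE {}".format(base_query, predicate)
    return base_query

# Without pushdown, every row of the (hypothetical) orders table is imported
# and filtered locally; with pushdown, only matching rows leave the database.
print(push_down_filter("SELECT * FROM orders", "order_date >= '2020-01-01'"))
```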


Changes in System Behavior

None.

Key Bug Fixes

Ticket  Description
TD-57354

Cannot import data from Azure Databricks. This issue is caused by an incompatibility with TLS v1.3, which was backported to Java 8.

TD-57180

AWS jobs run on Photon to publish to HYPER format fail during file conversion or writing.

New Known Issues

Ticket  Description
TD-56170

The Test Connection button for some relational connection types does not perform a test authentication of user credentials.

Tip

Workaround: Append the following to your Connect String Options:

Code Block
;ConnectOnOpen=true

This option forces the connection to validate user credentials as part of the connection. There may be a performance penalty when this option is used.
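If connections are created through scripts, the same option can be appended programmatically. A minimal sketch (the helper name is ours; only `;ConnectOnOpen=true` comes from the workaround above):

```python
def with_connect_on_open(options):
    """Append ;ConnectOnOpen=true to Connect String Options unless already present."""
    if "connectonopen=true" in options.lower():
        return options
    return options + ";ConnectOnOpen=true"

print(with_connect_on_open(""))                     # ;ConnectOnOpen=true
print(with_connect_on_open(";ConnectOnOpen=true"))  # unchanged
```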



Release 7.10

December 21, 2020

What's New

Tip

Tip: Check out the new in-app tours, which walk you through the steps of wrangling your datasets into clean, actionable data.

...

Language:


...

  • NOTE: This feature can be changed or removed from the platform at any time without notice. Do not deploy it in a production environment.
  • For more information, see API Workflow - Wrangle Output to Python.


Changes in System Behavior


Rebuild custom UDF JARs for Databricks clusters

...

For more information, see Improvements to the Type System.


Key Bug Fixes

Ticket  Description
TD-54742  Access to S3 is disabled after upgrade.
TD-53527  When importing a dataset via API that is sourced from a BZIP file stored on S3, the columns may not be properly split when the platform is permitted to detect the structure.

New Known Issues

Ticket  Description
TD-57180

AWS jobs run on Photon to publish to HYPER format fail during file conversion or writing.

Tip

Workaround: Run the job on the Spark running environment instead.


TD-56830

Receive malformed_query: enter a filter criterion when importing table from Salesforce.

Info

NOTE: Some Salesforce tables require mandatory filters when they are queried. Mandatory filters are not currently supported for Salesforce connections.


Release 7.9

November 16, 2020

What's New

Plan View:

  • Execute Plan using status rules: Starting in Release 7.9, you can execute tasks based on the previous task execution result. For more information, see Create a Plan.
  • Execute Parallel Plan tasks: In previous releases, plans were limited to a sequential order of task execution. Beginning in Release 7.9, you can create branches in the graph into separate parallel nodes, enabling the corresponding tasks to run in parallel. This feature enables you to have a greater level of control of your plans' workflows. For more information, see Create a Plan.
  • Zoom options: Zoom control options and keyboard shortcuts have been introduced in the plan canvas. For more information, see Plan View Page.
  • Filter Plan Runs: Filter your plan runs based on dates or plan types. For more information, see Plan Runs Page.

...

  • An All option has been added for selecting columns in the Transform Builder.  For more information, see Changes to the Language page.

Changes in System Behavior

Manage Users section has been deprecated:

...

  • The Dependencies tab has been renamed as the Dependency Graph tab.
  • The old flow view in the Dependency Graph tab has been replaced with the new flow view. For more information, see Job Details Page.

Key Bug Fixes

Ticket  Description
TD-55125  Cannot copy flow. However, export and import of the flow enables copying.
TD-53475  Missing associated artifact error when importing a flow.

New Known Issues

None.

Release 7.8

October 19, 2020

What's New


Plans:



  • The viewport position and zoom level are now preserved when returning to a given flow.


Publishing:

  • Improved performance when publishing to Tableau Server.
  • Configure publishing chunk sizes as needed. For more information, see Configure Data Service.

Language:


  • Rename Columns now supports converting column names to uppercase or lowercase, and shortening column names to a specified character length from the left or right. For more information, see Changes to the Language.


Connectivity:


Changes in System Behavior

JDBC connection pooling disabled:

...

Enhanced Flow and Flow View menu options:

The context menu options for Flow View and Flow have been renamed and reorganized for a better user experience.

Key Bug Fixes

None.

New Known Issues

Ticket  Description
TD-54030

When creating custom datasets from Snowflake, columns containing time zone data are rendered as null values in visual profiles, and publishing back to Snowflake fails.

Tip

Workaround: In your SELECT statement applied to a Snowflake database, references to time zone-based data must be wrapped in a function to convert it to UTC time zone. For more information, see Create Dataset with SQL.


Release 7.7

September 21, 2020

What's New

Flow View:

  • Automatically organize the nodes of your flow with a single click. See Flow View Page.

Changes in System Behavior

Deprecated Parameter History Panel Feature 

...

For more information, see Flow View Page.

Key Bug Fixes

Ticket  Description
TD-53318
Cannot publish results to relational targets when the flow name, output filename, or table name contains a hyphen (e.g., my - filename.csv).
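Until this is fixed, one avoidance strategy is to normalize hyphens out of flow, file, and table names before publishing. A minimal sketch (the helper name and replacement character are our choices):

```python
import re

def sanitize_name(name, replacement="_"):
    """Replace hyphens (and any surrounding spaces) in a flow, file,
    or table name so relational publishing does not fail."""
    return re.sub(r"\s*-\s*", replacement, name)

print(sanitize_name("my - filename.csv"))  # my_filename.csv
```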

New Known Issues

None.