Release 8.7

November 12, 2021

What's New


Install:

Databases:

  • Support for PostgreSQL 11 on Azure.

    NOTE: PostgreSQL 12 is not supported in Azure at this time. Please install PostgreSQL 11. You can modify the installation commands for PostgreSQL 12 referenced in the database installation documentation to use PostgreSQL 11. For more information, see Install Databases.

  • Support for PostgreSQL 12 for all other deployments.

    NOTE: Support for PostgreSQL 9.6 has been deprecated. Unless you are installing the Trifacta platform in Azure, please install PostgreSQL 12. Azure deployments require PostgreSQL 11 for this release.

  • See Product Support Matrix.

Browsers:


  • Updates to supported browsers:
    • Mozilla Firefox is generally supported.
    • Microsoft Edge is now supported.

      NOTE: This feature is in Beta release.
    • Newer versions of supported browsers are now supported.
    • For more information, see Browser Requirements.


Plans:

  • Create plan tasks to deliver messages to a specified Slack channel.

    For more information, see Create Slack Task.

Single Sign-On:

  • If using SAML integration, you can now configure the security algorithm for the Trifacta application to use. See Configure SSO for SAML.


Connectivity:

  • Access to S3 is now managed using the native AWS SDK.
  • Append Trifacta user identifiers to SQL queries to enable auditing through your database logs. For more information, see Configure Connectivity.
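
The sketch below illustrates the general idea behind query tagging for auditing: an identifier appended as a SQL comment travels with the statement into the database's query log. This is a hypothetical Python illustration only; the actual comment format and its configuration are described in Configure Connectivity.

# Hypothetical illustration of query tagging -- not Trifacta's implementation.
def tag_query(sql: str, user_id: str) -> str:
    """Append a user identifier as a trailing SQL comment so that it
    appears alongside the statement in the database's query log."""
    return f"{sql} /* trifacta-user: {user_id} */"

print(tag_query("SELECT * FROM orders", "jdoe@example.com"))
# SELECT * FROM orders /* trifacta-user: jdoe@example.com */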


Azure:

  • Support for installation on Azure Gov Cloud.

    NOTE: The Azure environment must be set to US_GOV.

    For more information, see Configure for Azure.

Databricks:

  • Support for Databricks 7.x and 8.x.

    NOTE: Databricks 7.3 and Databricks 8.3 are recommended.

  • Support for Databricks cluster creation via cluster policies.
  • Store a user-defined set of secret information such as credentials in Databricks Secrets.
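
In a Databricks notebook, stored secrets are read back through the standard Databricks Secrets utility. A minimal sketch, assuming a secret scope named trifacta-creds and a key named jdbc-password (both placeholders):

# Runs inside a Databricks notebook, where `dbutils` is predefined.
# The scope and key names are placeholders for your own secrets.
jdbc_password = dbutils.secrets.get(scope="trifacta-creds", key="jdbc-password")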

Performance:

  • Improved performance when previewing or performing quick scan samples on Parquet files.

    NOTE: This feature may require enabling in your environment.

    See Configure Photon Running Environment.

Trifacta node:

  • Node.js upgraded to 14.17.5.


Changes in System Behavior


Publishing:

Improvements to publishing of Trifacta Date values to Snowflake. For more information, see Improvements to the Type System.


Deprecated


HDInsight no longer supported:

  • HDInsight 3.5 and 3.6 are no longer supported.

HDP 3.0 deprecated:

  • Please upgrade to HDP 3.1.

    NOTE: In a future release, support for Hortonworks Data Platform (HDP) will be deprecated. Please migrate to using a different supported running environment. For more information, see Product Support Matrix.

API:

  • The deprecated API endpoint for transferring assets between users has been removed from the platform. This endpoint was previously replaced by an improved method of transfer.
  • Some connection-related endpoints have been deprecated. These endpoints have little value for public use.
  • For more information, see Changes to the APIs.


Key Bug Fixes

Ticket      Description
TD-65753    Some platform services do not bind to localhost only.
TD-65502    Datasets created from parameters can improperly be referenced in recipes, which returns an error during job execution.
TD-63953    HYPER format publishing jobs remain in a queued state on Azure.
TD-63564    Schedules created by a flow collaborator with editor access stop working if the collaborator is removed from the flow. Collaborators with viewer access cannot create schedules.

New Known Issues

Ticket      Description
TD-63974    In imported datasets sourced from CSV files, double quotes that are escaped with a backslash (\"backslash-escaped value\") can cause the remainder of the row to be compressed into a single cell. See the illustration after this table.
TD-63517    Unpivoting a String column preserves null values in Spark but converts them to empty strings in Photon. Running jobs on the different running environments generates different results.

Workaround for TD-63517: After the unpivot step, you can add an Edit with formula step. Set the columns to all of the columns in the unpivot and add the following formula, which converts all missing values to null values:

if(ismissing($col),NULL(),$col)
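
To illustrate TD-63974: most CSV dialects escape embedded double quotes by doubling them, so a parser that expects doubling mis-reads backslash-escaped quotes, which can merge the rest of the row into one field. Trifacta's CSV parser is not Python's csv module; the snippet below is an analogy only, demonstrating the same hazard:

import csv, io

line = 'id,comment,status\n1,"note: \\"urgent\\" item",open\n'

# The default dialect expects doubled quotes ("") and mis-parses this row.
print(list(csv.reader(io.StringIO(line))))

# A reader configured for backslash escapes parses the row cleanly.
print(list(csv.reader(io.StringIO(line), escapechar="\\", doublequote=False)))
# [['id', 'comment', 'status'], ['1', 'note: "urgent" item', 'open']]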

Release 8.6

August 2, 2021

What's New


Performance:


  • Conversion jobs are now processed asynchronously. 

  • Better management of file locking and concurrency during job execution.

    This feature must be enabled in your environment and requires installation of a Redis server. See Configure for Redis.


Better Handling of JSON files

The Trifacta application now supports regularly formatted JSON files during import. You can now import flat JSON records contained in a single array object; each record in the array is treated as a single line and imported as a new row. For more information, see Working with JSON v2.
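
For example, a file of the following shape now imports directly, one record per row. A minimal Python sketch of the mapping (the conversion itself is handled by the application, not by Python):

import json

# Flat JSON records contained in a single array object.
raw = '[{"id": 1, "name": "ada"}, {"id": 2, "name": "grace"}]'

# Each record in the array becomes one row of the dataset.
for row in json.loads(raw):
    print(row)
# {'id': 1, 'name': 'ada'}
# {'id': 2, 'name': 'grace'}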


Changes in System Behavior 


Compression scheme for imported files can now be inferred

Prior to this release, when a file was imported, the Trifacta application inferred any compression applied to the file from its filename extension. For example, if the filename ended with .gz, the file was passed through the internal code for decompressing GZIP files.

Beginning in this release, the Trifacta application can be configured to detect the applied compression by reading the first few bytes of the file. Based on this data signature, the application passes the file to the appropriate decompression code.

For more information on enabling, see Miscellaneous Configuration.
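
Signature-based detection is a standard technique: well-known compression formats begin with fixed magic bytes. A minimal sketch of the idea in Python (the byte patterns below are standard, but the application's actual detection logic is internal):

# Magic-byte signatures for common compression formats.
MAGIC = {
    b"\x1f\x8b": "gzip",
    b"BZh": "bzip2",
    b"\x28\xb5\x2f\xfd": "zstd",
}

def sniff_compression(path: str) -> str:
    """Return the compression scheme suggested by the file's first bytes."""
    with open(path, "rb") as f:
        head = f.read(4)
    for signature, scheme in MAGIC.items():
        if head.startswith(signature):
            return scheme
    return "none"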


Deprecated

None.

Key Bug Fixes

None.

New Known Issues

Ticket      Description
TD-63564    Schedules created by a flow collaborator with editor access stop working if the collaborator is removed from the flow. Collaborators with viewer access cannot create schedules.

Tip: Flow owners can delete the schedule and create a new one. When this issue is fixed, the original schedule will continue to be executed under the flow owner's account.

Release 8.5

June 28, 2021

What's New


Parameterization:

  • Create environment parameters to ensure that all users of the project or workspace use consistent references.

    NOTE: You must be a workspace administrator or project owner to create environment parameters.

    Tip: Environment parameters can be exported from one project or workspace and imported into another, so that these references are consistent across the enterprise.

  • Parameterize names of your storage buckets using environment parameters.



Job execution:

  • Define SQL scripts to execute before data ingestion or after publication for file-based or table-based jobs.


Connectivity:

  • Connect to your relational database systems hosted on Amazon RDS. On the Connections page, click the Amazon RDS card for your connection type.

    For more information, see Create Connection Window.



Developer:

Download and install the Python SDK, which enables you to leverage the visual tools of the Trifacta application to transform data in your existing Python pipelines.

NOTE: This is an Alpha release. Do not deploy the Python SDK in production environments.

For more information, see Python SDK.
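
Because the SDK is in Alpha, its interface may change; the sketch below is hypothetical, with illustrative names only (trifacta, wrangle, and to_pandas are not confirmed API), and conveys only the intended workflow: wrangle data visually in the Trifacta application from within an existing Python pipeline.

import pandas as pd

df = pd.read_csv("orders.csv")  # an existing pipeline step

# HYPOTHETICAL calls -- names are illustrative, not the SDK's actual API:
# wrangled = trifacta.wrangle(df)   # open the data in the Trifacta application
# df = wrangled.to_pandas()         # continue the pipeline with the result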


Job execution:

You can choose to ignore recipe errors before job execution and then review any errors in the recipe through the Job Details page.

Language:


  • The NUMVALUE function converts a String value formatted as a number into an Integer or Decimal value.
  • The NUMFORMAT function now supports configurable grouping and decimal separators for localizing numeric values.
  • For more information, see Changes to the Language.
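
As a rough Python analogy of the behavior (these are not the Wrangle functions themselves, just an illustration of localized parsing and formatting):

# NUMVALUE-like: parse a string formatted as a number into a numeric value.
def num_value(s: str, grouping: str = ",", decimal: str = ".") -> float:
    return float(s.replace(grouping, "").replace(decimal, "."))

# NUMFORMAT-like: format a number with configurable separators.
def num_format(x: float, grouping: str = ",", decimal: str = ".") -> str:
    s = f"{x:,.2f}"  # e.g. 1,234.56
    return s.translate(str.maketrans({",": grouping, ".": decimal}))

print(num_value("1.234,56", grouping=".", decimal=","))  # 1234.56
print(num_format(1234.56, grouping=".", decimal=","))    # 1.234,56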

Changes in System Behavior

None.

Key Bug Fixes

None.

New Known Issues

None.

Release 8.4

May 24, 2021

What's New




Collaboration:

  • You can receive email notifications whenever a plan or a flow is shared with you by the owner.



Changes in System Behavior

None.

Key Bug Fixes

Ticket      Description
TD-60881    Incorrect file path and missing file extension in the application for parameterized outputs.
TD-60382    Date format M/d/yy is handled differently by the PARSEDATE function on Trifacta Photon and Spark.
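
Two-digit-year formats such as M/d/yy are a classic source of cross-engine disagreement, because each engine must pick a century for the two-digit year. By way of analogy only (this is Python's convention, not Photon's or Spark's):

from datetime import datetime

# Python's strptime pivots two-digit years at 69: 00-68 -> 20xx, 69-99 -> 19xx.
# An engine with a different pivot or century rule parses 'yy' differently.
print(datetime.strptime("3/7/68", "%m/%d/%y").year)  # 2068
print(datetime.strptime("3/7/69", "%m/%d/%y").year)  # 1969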

New Known Issues

None.

Release 8.3

April 26, 2021

What's New




Connectivity:

You can publish results to external S3 buckets by providing an access key and secret key. For more information, see External S3 Connections.
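
For reference, key-based access to an external bucket looks like this in the AWS SDK for Python; the bucket and object names are placeholders, and the Trifacta application manages these credentials through the connection itself:

import boto3

# Placeholder credentials and names; supply your own via the connection.
s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",
    aws_secret_access_key="...",
)
s3.upload_file("results.csv", "external-bucket", "published/results.csv")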


Job execution:

Introducing new filter pushdowns to optimize the performance of your flows during job execution. For more information, see Flow Optimization Settings Dialog.


Job results:

You can now preview job results and download them from the Overview tab of the Job details page. For more information, see Job Details Page.

Tip: You can also preview job results in Flow View. See View for Outputs.



Changes in System Behavior

Improved method of JSON import

Beginning in this release, the Trifacta application now uses the conversion service to ingest JSON files during import. This improved method of ingestion can save significant time wrangling JSON into records.

NOTE: The new method of JSON import is enabled by default but can be disabled as needed.

For more information, see Working with JSON v2.

Flows that use imported datasets created using the old method continue to work without modification.

NOTE: Support for the v1 version of JSON import is likely to be deprecated in a future release. You should switch to the new version as soon as possible. Future work on JSON support targets the v2 version only. For more information on using the old version and migrating to the new version, see Working with JSON v1.

Key Bug Fixes

Ticket      Description
TD-60701    Most non-ASCII characters are incorrectly represented in visual profiles downloaded in PDF format.
TD-60660    Azure SSO redirects to the Home page instead of the target page after login.
TD-59854    Datetime column from a Parquet file is inferred as the wrong data type on import.


New Known Issues

None.
