
D toc

Release 6.8

...

February 7, 2020

This release enables new features and makes several relational connections generally available.

What's New

Install:

...

Info

NOTE: Support for CDH 6.0 has been deprecated. See End of Life and Deprecated Features.

Import:

...

D beta
Info

NOTE: This feature must be enabled.

See Import PDF Data.

...

LDAP:

Cluster Clean:

...

Changes to System Behavior

D s deskapp:

Info

NOTE: In a future release, the D s deskapp will be deprecated. Please switch to a supported version of Google Chrome or Mozilla Firefox. Support for Edge Chromium is expected in a future release. See Desktop Requirements.

CLI and v3 endpoints (Release 6.4):

Info

NOTE: Do not attempt to connect to the D s platform using any version of the CLI or the v3 endpoints. They are no longer supported and are unlikely to work.

In Release 6.4:

  • The Command Line Interface (CLI) was deprecated. Customers must use the v4 API endpoints instead.
  • The v3 versions of the API endpoints were deprecated. Customers must use the v4 API endpoints instead.
  • Developer content was provided to assist in migrating to the v4 API endpoints. 
  • For more information on acquiring this content, please contact D s support.
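
As an illustration of the v4 style of the endpoints referenced above, a job that was previously launched through the CLI can be started by posting a jobGroups request. The dataset id below is hypothetical and optional fields are omitted; see the v4 API documentation for the exact schema:

Code Block
POST /v4/jobGroups
{
  "wrangledDataset": {
    "id": 7
  }
}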

General availability:

  • The following relational connections are now generally available:
    • DB2 (import only)
    • Salesforce (import only)
    • Tableau Server (publish only)
      For more information, see Connection Types.

Key Bug Fixes

...

Publishing to Databricks Tables fails on ADLS Gen1 in user mode.

New Known Issues

...

Importing an exported flow that references a Google Sheets or Excel source breaks the connection to the input source.

...

Release 6.8

December 6, 2019

Welcome to Release 6.8 of D s product. This release introduces several key features around operationalizing the platform across the enterprise. Enterprise stakeholders can now receive email notifications when recurring jobs have succeeded or failed, keeping data consumers outside of the platform informed. This release also introduces a generalized webhook interface, which facilitates push notifications to applications such as Slack when jobs have completed. When jobs fail, users can download a much richer support bundle containing configuration files, script files, and a specified set of log files.
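
For example, a webhook that targets a Slack incoming webhook URL would post a small JSON payload such as the following when a job completes. The URL and message text here are placeholders; the webhook configuration options themselves are described in the product documentation:

Code Block
POST https://hooks.slack.com/services/T0000/B0000/XXXXXXXX
{
  "text": "Job 1234 completed successfully."
}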

...

In the application, you can now use keyboard shortcuts to navigate the workspace and the Transformer page. Support for the Firefox browser has also arrived. Read on for more goodness added with this release.

What's New

Install:

Workspace:


  • Individual users can now enable or disable keyboard shortcuts in the workspace or Transformer page. See User Profile Page.
  • Configure locale settings at the workspace or user level. See Locale Settings.
  • You can optionally duplicate the datasets from a source flow when you create a copy of it. See Flow View Page.
  • Create a copy of your imported dataset. See Library Page.

Browser:

...

...

  • For each supported browser, the latest stable version and the two previous stable versions at the time of release are supported.

    Info

    NOTE: Stable browser versions released after a given release of D s product will NOT be supported for any prior version of D s product. A best effort will be made to support newer versions released during the support lifecycle of the release.

    For more information, see Desktop Requirements.

Install:

Info

NOTE: In the next release of D s product after Release 6.8, support for installation on CentOS/RHEL 6.x and Ubuntu 14.04 will be deprecated. You should upgrade the D s node to a supported version of CentOS/RHEL 7.x or Ubuntu 16.04. Before performing the upgrade, please perform a full backup of the D s platform and its databases. See Backup and Recovery.



  • Support for Spark 2.1 has been deprecated. Please upgrade to a supported version of Spark.
    • Support for EMR 5.6 and EMR 5.7 has also been deprecated. Please upgrade to a supported version of EMR.
    • For more information, see Product Support Matrix.
  • To simplify the installation distribution, only the Hadoop dependencies for the recommended version are included in the software download. For other supported Hadoop distributions, you must download the dependencies from the D s item FTP site and install them on the D s node. See Install Hadoop Dependencies.
  • The D s node has been upgraded to use Python 3. This instance of Python has no dependencies on any Python version external to the D s node.

...

  • The Command Line Interface (CLI) was deprecated. Customers must use the v4 API endpoints instead.
  • The v3 versions of the API endpoints were deprecated. Customers must use the v4 API endpoints instead.
  • Developer content was provided to assist in migrating to the v4 API endpoints. 
  • For more information on acquiring this content, please contact D s support.

Key Bug Fixes

TD-40348

When loading a recipe in an imported flow that references an imported Excel dataset, the Transformer page displays an Input validation failed: (Cannot read property 'filter' of undefined) error, and the screen is blank.

TD-42080

Cannot run a flow or deployment that contains more than 10 recipe jobs.


New Known Issues

TD-46123

Cannot modify the type of relational target for a publishing action.

Tip

Workaround: Create a new publishing action with the desired relational target. Remove the original one if necessary. See Run Job Page.

TD-45923

Publishing a compressed Snappy file to SFTP fails.

TD-45922

You cannot publish TDE format to SFTP destinations.
TD-45492

Publishing to Databricks Tables fails on ADLS Gen1 in user mode.

TD-45273

Artifact Storage Service fails to start on HDP 3.1.

Tip

Workaround: The Artifact Storage Service can be configured to reference the HDP 2.6 Hadoop bundle JAR instead:

  1. D s config
  2. Locate the following property:

    Code Block
    "artifact-storage-service.classpath"


  3. Replace this value:

    Code Block
    :%(topOfTree)s/%(hadoopBundleJar)s


  4. With the following:

    Code Block
    :%(topOfTree)s/conf/hadoop-site/:%(topOfTree)s/hadoop-deps/hdp-2.6/build/libs/hdp-2.6-bundle.jar


  5. Save changes and restart the platform.
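
For reference, after the replacement in steps 3 and 4 the full property might look like the following in the platform configuration. The leading classpath entries, represented here by a placeholder, vary by environment and should be left unchanged:

Code Block
"artifact-storage-service.classpath": "<existing classpath entries>:%(topOfTree)s/conf/hadoop-site/:%(topOfTree)s/hadoop-deps/hdp-2.6/build/libs/hdp-2.6-bundle.jar"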


TD-45122

API: Re-running a job using only the wrangleDataset identifier fails when writeSettings were specified for the original job, even if the original job succeeded.

Tip

Workaround: Use a full jobGroups job specification each time that you run a job.

See API JobGroups Create v4.
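
For illustration, a full job specification includes both the dataset reference and the write settings in a single request. The values below are placeholders, and the exact field names and defaults are defined in API JobGroups Create v4:

Code Block
POST /v4/jobGroups
{
  "wrangledDataset": {
    "id": 7
  },
  "overrides": {
    "writesettings": [
      {
        "path": "hdfs://hadoop:50070/trifacta/queryResults/user@example.com/output.csv",
        "action": "create",
        "format": "csv",
        "compression": "none",
        "header": true,
        "asSingleFile": true
      }
    ]
  }
}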

TD-44429

Cannot publish outputs to relational targets. The error Encountered error while processing stream is displayed.

Tip

Workaround: This issue may be caused by the trifacta service account not having write and execute permissions to the /tmp directory on the D s node. If so, you can do either of the following:

  1. Enable write and execute permissions for the account on /tmp.
  2. Create a new temporary directory and provide the service account write and execute permissions to it. Then, add the following to data-service.jvmOptions:

    Code Block
    -Dorg.xerial.snappy.tempdir=/new/directory/with/writeexecuteaccess
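
    For example, in the platform configuration the data-service settings might then look like the following. Any existing JVM options, shown here as a placeholder, should be preserved:

    Code Block
    "data-service": {
      "jvmOptions": [
        "<existing JVM options>",
        "-Dorg.xerial.snappy.tempdir=/new/directory/with/writeexecuteaccess"
      ]
    }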



TD-44427

Cannot publish dataset containing duplicate rows to Teradata. Error message:

Code Block
Caused by: java.sql.SQLException: [Teradata Database] [TeraJDBC 15.10.00.14] [Error -2802] [SQLState 23000] Duplicate row error in abc_trifacta.tmp_218768523.
 at 


Tip

Workaround: This is a known limitation on Teradata. For more information on this limitation, see Enable Teradata Connections.


...