
D toc

Release 6.4.2

November 15, 2019

This release is primarily a bug-fix release; it also includes the following new features.

What's New

API:

  • Apply overrides at the time of job execution via the API (see the example sketch after this list). 
  • Define import mapping rules for your deployments that use relational sources or publish to relational targets. 
  • See Changes to the APIs.
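
For illustration, here is a minimal sketch of applying overrides when triggering a job through the v4 API, assuming a Python client. The host, token, identifiers, and the exact shape of the overrides object are placeholders; confirm the field names against Changes to the APIs for your deployment.

import requests

# Placeholder values for illustration only; substitute your own deployment details.
BASE_URL = "https://example.trifacta.local:3005"
HEADERS = {"Authorization": "Bearer <api-access-token>"}

payload = {
    "wrangledDataset": {"id": 28629},   # recipe (wrangled dataset) to execute; placeholder id
    "overrides": {                      # assumed override fields; verify against the API reference
        "profiler": False,
        "writesettings": [
            {
                "path": "hdfs://hadoop:50070/trifacta/queryResults/out.csv",
                "action": "create",
                "format": "csv",
            }
        ],
    },
}

# Trigger the job with the overrides applied at execution time.
resp = requests.post(f"{BASE_URL}/v4/jobGroups", json=payload, headers=HEADERS)
resp.raise_for_status()
print(resp.json())   # response includes the identifier of the new job group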

Job execution:

  • By default, the 
    D s webapp
     permits up to four jobs from the same flow to be executed at the same time. If needed, you can configure the application to execute jobs from the same flow one at a time. See Configure Application Limits.

Changes in System Behavior

None.

Key Bug Fixes

Ticket      Description
TD-44548    RANGE function returns null values if more than 1000 values in output.
TD-44494    Lists are not correctly updated in Deployment mode.
TD-44311    Out of memory error when running a flow with many output objects.
TD-44188    Performance is poor for SQL DW connection.
TD-43877    Preview after a DATEFORMAT step does not agree with results or profile values.
TD-44035    Spark job failure from Excel source.
TD-43849    Export flows are broken when a recipe includes Standardization or Transform by Example tasks.

D s advfeature

New Known Issues

None.

Release 6.4.1

August 30, 2019

This feature release includes bug fixes and introduces SSO connections for Azure relational sources.

What's New

Connectivity:

  • Support for SSO connections for Azure relational sources.

Changes in System Behavior

Configuration changes:

  • The parameter to enable custom SQL query has been moved to the Workspace Admin page. 
  • The parameter to disable schematized output has been moved to the Workspace Admin page.
  • For more information, see Changes to Configuration.

Key Bug Fixes

Ticket      Description
TD-39086    Hive ingest job fails on Microsoft Azure.

New Known Issues

None.

Release 6.4

August 1, 2019

This release of 

D s product
rtrue
 features broad improvements to the recipe development experience, including multi-step operations and improved copy and paste within the Recipe panel. As a result of the panel's redesign, you can now create user-defined macros, which are sets of sequenced and parameterized steps for easy reuse and adaptation in other recipes. When jobs are executed, detailed monitoring provides enhanced information on the progress of the job through each phase of the process. You can also connect to a broader ecosystem of sources and targets, including enhancements to the integration with Tableau Server and AWS Glue. New for this release: read from your Snowflake sources. Read on for additional details on new features and enhancements.

What's New

Transformer Page:

  • The redesigned Recipe panel enables multi-step operations and more robust copy and paste actions. See Recipe Panel.
  • Introducing user-defined macros, which enable saving and reusing sequences of steps. For more information, see Overview of Macros.
  • Transform a column by providing example output values. See Transformation by Example Page.
  • Browse current flow for datasets or recipes to join into the current recipe. See Join Panel.
  • Replace specific cell values. See Replace Cell Values.

...

  • New databases:

    • Job Metadata Service database

Changes in System Behavior

Info

NOTE: The

D s item
itemsoftware
must now be installed on an edge node of the cluster. Existing customers who cannot migrate to an edge node will be supported. You will be required to update cluster files on the
D s node
whenever they change, and cluster upgrades may be more complicated. You should migrate your installation to an edge node if possible. For more information, see System Requirements.

Info

NOTE: The v3 APIs are no longer supported. Please migrate immediately to using the v4 APIs. For more information, see API Migration to v4.
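
As a quick illustration of the change, the sketch below rewrites a v3-style request against the corresponding v4 endpoint, assuming a Python client. The host, token, and job group identifier are placeholders; take the authoritative endpoint, payload, and response changes from API Migration to v4.

import requests

BASE_URL = "https://example.trifacta.local:3005"   # placeholder host
HEADERS = {"Authorization": "Bearer <api-access-token>"}

# Before (v3, no longer supported):
#   GET {BASE_URL}/v3/jobGroups/961
# After (v4):
resp = requests.get(f"{BASE_URL}/v4/jobGroups/961", headers=HEADERS)
resp.raise_for_status()
print(resp.json())   # inspect the v4 response; its field names differ from v3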

Info

NOTE: The command line interface (CLI) is no longer available. Please migrate immediately to using the v4 APIs. For more information, see CLI Migration to APIs.


Info

NOTE: The PNaCl browser client extension is no longer supported. Please verify that all users of

D s product
are using a supported version of Google Chrome, which automatically enables use of WebAssembly. For more information, see Desktop Requirements.

...

Info

NOTE: The Chat with us feature is no longer available. For

D s product
customers, this feature had to be enabled in the product. For more information, see
D s support
.

Info

NOTE: The desktop version of Trifacta Wrangler will cease operations on August 31, 2019. If you are still using the product at that time, your data will be lost. Please transition to using the free Cloud version of

D s product
productss
rtrue
. Automated migration is not available. To register for a free account, please visit https://cloud.trifacta.com.


Workspace:

...

  • In prior releases, the documentation listed UTF32-BE and UTF32-LE as supported file encodings. These encodings are not supported. Documentation has been updated to correct this error. See Supported File Encoding Types.

Key Bug Fixes

Ticket      Description
TD-41260

Unable to append

D s item
itemDecimal type
into table with Hive Float type. See Hive Data Type Conversions.

TD-40424

UTF-32BE and UTF-32LE are listed in the application as supported file encoding options, but they do not work.

Info

NOTE: Although these options are available in the application, they have never been supported in the underlying platform. They have been removed from the interface.

TD-40299    Cloudera Navigator integration cannot locate the database name for JDBC sources on Hive.
TD-40243    API access tokens do not work with native SAML SSO authentication.
TD-39513    Import of a folder of Excel files as a parameterized dataset only imports the first file, and sampling may fail.
TD-39455

HDI 3.6 is not compatible with Guava 26.

TD-39092

$filepath and $sourcerownumber references are not supported for Parquet file inputs.

For more information, see Source Metadata References.

TD-31354    When creating Tableau Server connections, the Test Connection button is missing. See Create Tableau Server Connections.
TD-36145    Spark running environment recognizes numeric values preceded by + as Integer or Decimal data type. Photon running environment does not and types these values as strings.


New Known Issues

Ticket      Description
TD-42638

Publishing and ingest jobs that are short in duration cannot be canceled.

Tip

Workaround: Allow the job to complete. You can track the job's progress through these phases in the application. See Job Details Page.

TD-39052

Changes to sign-out behavior for the reverse proxy method of SSO do not take effect after upgrade.