
Release Notes 6.4

Release 6.4.2

November 15, 2019

This release is primarily a bug-fix release, with the following new features.

What's New


  • Apply overrides at the time of job execution via API. See Changes to the APIs.

  • Define import mapping rules for your deployments that use relational sources or publish to relational targets.

Job execution:

  • By default, the Trifacta Application permits up to four jobs from the same flow to be executed at the same time. If needed, you can configure the application to execute jobs from the same flow one at a time. See Configure Application Limits.
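
The override capability above means a job request can carry per-run settings instead of relying on the configuration saved in the flow. Below is a minimal sketch of assembling such a request body; the endpoint and field names (`wrangledDataset`, `overrides`, `execution`, `profiler`, `writesettings`) follow the general shape of the v4 API but are illustrative here, so consult Changes to the APIs for the authoritative schema.

```python
import json

def build_job_request(recipe_id, execution=None, profiling=None, writesettings=None):
    """Assemble a job-execution request body with optional per-run overrides.

    The field names used here are illustrative assumptions; see the v4 API
    documentation for the authoritative request schema.
    """
    body = {"wrangledDataset": {"id": recipe_id}}
    overrides = {}
    if execution is not None:        # e.g. "photon" or "spark"
        overrides["execution"] = execution
    if profiling is not None:        # enable/disable output profiling for this run
        overrides["profiler"] = profiling
    if writesettings is not None:    # per-run output location/format settings
        overrides["writesettings"] = writesettings
    if overrides:
        body["overrides"] = overrides
    return body

# Example: run recipe 42 on Spark with profiling disabled for this run only.
payload = build_job_request(42, execution="spark", profiling=False)
print(json.dumps(payload, indent=2))
```

The resulting JSON would be POSTed to the job-execution endpoint with your API access token; when no overrides are supplied, the job runs with the settings stored in the flow.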

Changes in System Behavior


Key Bug Fixes

  • RANGE function returns null values if there are more than 1000 values in the output.

  • Lists are not correctly updated in Deployment mode.

  • Out-of-memory error when running a flow with many output objects.

  • Performance is poor for SQL DW connections.

  • Preview after a DATEFORMAT step does not agree with results or profile values.

  • Spark job failure from an Excel source.

  • Flow export is broken when a recipe includes Standardization or Transform by Example tasks.

    NOTE: This Advanced Feature is available in Designer Cloud Powered by Trifacta Enterprise Edition under a separate, additional license. If it is not available under your current license, do not enable it for use. Please contact your representative.

New Known Issues

  • Stepping backward to an early step in a recipe sometimes fails to properly update the state of the quality bar and histograms in the data grid.

    This issue is caused by caching of snapshot profiles from the data grid. Workaround: reload the page in the browser.

Release 6.4.1

August 30, 2019

This release includes bug fixes and introduces SSO connections for Azure relational sources.

What's New

  • SSO connections for Azure relational sources.

Changes in System Behavior

Configuration changes:

  • The parameter to enable custom SQL query has been moved to the Workspace Settings page.

  • The parameter to disable schematized output has been moved to the Workspace Settings page.

  • For more information, see Changes to Configuration.

Key Bug Fixes

  • Hive ingest job fails on Microsoft Azure.

New Known Issues


Release 6.4

August 1, 2019

This release of Designer Cloud Powered by Trifacta Enterprise Edition features broad improvements to the recipe development experience, including multi-step operations and improved copy and paste within the Recipe panel. As a result of the panel's redesign, you can now create user-defined macros, which are sets of sequenced and parameterized steps for easy reuse and adaptation in other recipes. When jobs are executed, detailed monitoring provides enhanced information on the progress of the job through each phase of the process. You can also connect to a broader ecosystem of sources and targets, including enhancements to the integration with Tableau Server and AWS Glue. New for this release: read from your Snowflake sources. Read on for additional details on new features and enhancements.

What's New

Transformer Page:

  • The redesigned Recipe panel enables multi-step operations and more robust copy and paste actions. See Recipe Panel.

  • Introducing user-defined macros, which enable saving and reusing sequences of steps. For more information, see Overview of Macros.

  • Transform a column by providing example output values. See Transformation by Example Page.

  • Browse current flow for datasets or recipes to join into the current recipe. See Join Window.

  • Replace specific cell values. See Replace Cell Values.

Install:

  • Support for deployment of Designer Cloud Powered by Trifacta platform via Docker image. See Install for Docker.


Connectivity:

  • Support for Snowflake database connections.


    This feature is supported only when Designer Cloud Powered by Trifacta Enterprise Edition is installed on customer-managed AWS infrastructure.

    For more information, see Snowflake Access.

Enhanced support for AWS Glue integration:


  • Add timestamp parameters to your custom SQL statements to enable data import relative to the job execution time. See Create Dataset with SQL.


  • Leverage your enterprise's SAML identity provider to pass through a set of IAM roles that Alteryx users can select for access to AWS resources.


    This authentication method is supported only if SSO authentication has been enabled using the platform-native SAML authentication method. For more information, see Configure SSO for SAML.

    For more information, see Configure for AWS SAML Passthrough Authentication.


  • Administrators can review, enable, disable, and delete schedules through the application. See Schedules Page.


  • Share flows and connections with groups of users imported from your LDAP identity provider.


    NOTE: This feature is in Beta release.

    See Configure Users and Groups.



  • New functions. See Changes to the Language.

  • Broader support for metadata references. For Excel files, $filepath references now return the location of the source Excel file. Sheet names are appended to the end of the reference. See Source Metadata References.



  • New databases:

    • Job Metadata Service database
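
The timestamp parameters for custom SQL mentioned above substitute a value relative to job execution time into the query text before import. A rough sketch of that substitution, assuming a `{timestamp}` placeholder and a strftime-style format string (both are illustrative assumptions; the product defines its own parameter syntax, documented in Create Dataset with SQL):

```python
from datetime import datetime, timedelta

def resolve_timestamp_param(sql, fmt="%Y-%m-%d", offset_days=0, now=None):
    """Replace a {timestamp} placeholder with a value relative to execution time.

    The {timestamp} placeholder and strftime-style format here are assumptions
    for illustration; see Create Dataset with SQL for the actual syntax.
    """
    now = now or datetime.now()
    stamp = (now + timedelta(days=offset_days)).strftime(fmt)
    return sql.replace("{timestamp}", stamp)

sql = "SELECT * FROM sales WHERE load_date = '{timestamp}'"
# Import yesterday's partition relative to when the job runs:
resolved = resolve_timestamp_param(sql, offset_days=-1, now=datetime(2019, 8, 1))
print(resolved)  # SELECT * FROM sales WHERE load_date = '2019-07-31'
```

Because the value is resolved at execution time, a scheduled job can repeatedly import the most recent slice of data without editing the dataset definition.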

Changes in System Behavior


The Alteryx software must now be installed on an edge node of the cluster. Existing customers who cannot migrate to an edge node will be supported. You will be required to update cluster files on the Trifacta node whenever they change, and cluster upgrades may be more complicated. You should migrate your installation to an edge node if possible. For more information, see System Requirements.


The v3 APIs are no longer supported. Please migrate immediately to using the v4 APIs.


The command line interface (CLI) is no longer available. Please migrate immediately to using the v4 APIs.


The PNaCl browser client extension is no longer supported. Please verify that all users of Designer Cloud Powered by Trifacta Enterprise Edition are using a supported version of Google Chrome, which automatically enables use of WebAssembly. For more information, see Browser Requirements.


Support for Java 7 has been deprecated in the platform. Please upgrade to Java 8 on the Trifacta node and any connected cluster. Some versions of Cloudera may install Java 7 by default.


The Chat with us feature is no longer available. For Designer Cloud Powered by Trifacta Enterprise Edition customers, this feature had to be enabled in the product. For more information, see Alteryx Support.


The desktop version of Desktop Wrangler will cease operations on August 31, 2019. If you are still using the product at that time, your data will be lost. Please transition to using the free Cloud version of Designer Cloud Powered by Trifacta Educational. Automated migration is not available. To register for a free account, please visit



  • The endpoint used to assign an AWSConfig object to a user has been replaced.


    If you used the APIs to assign AWSConfig objects in a previous release, you must update your scripts to assign AWS configurations. For more information, see Changes to the APIs.


  • In prior releases, the documentation listed UTF32-BE and UTF32-LE as supported file formats. These formats are not supported. Documentation has been updated to correct this error. See Supported File Encoding Types.

Key Bug Fixes

  • Unable to append Alteryx Decimal type into a table with Hive Float type. See Hive Data Type Conversions.

  • UTF-32BE and UTF-32LE are listed as supported file encoding options but do not work.

    Although these options were available in the application, they have never been supported in the underlying platform. They have been removed from the interface.

  • Cloudera Navigator integration cannot locate the database name for JDBC sources on Hive.

  • API access tokens do not work with native SAML SSO authentication.

  • Import of a folder of Excel files as a parameterized dataset imports only the first file, and sampling may fail.

  • HDI 3.6 is not compatible with Guava 26.

  • $filepath and $sourcerownumber references are not supported for Parquet file inputs. For more information, see Source Metadata References.

  • When creating Tableau Server connections, the Test Connection button is missing. See Tableau Server Connections.

  • The Spark running environment recognizes numeric values preceded by + as Integer or Decimal data type; the Photon running environment does not and types these values as strings.

New Known Issues

  • Publishing and ingest jobs that are short in duration cannot be canceled.

    Workaround: allow the job to complete. You can track the job's progress through each phase in the application. See Job Details Page.

  • Changes to sign-out under the reverse proxy method of SSO do not take effect after upgrade.