Release Notes 6.8
Release 6.8.2
April 27, 2020
What's New
Import:
Enhanced full-screen interface for importing datasets using custom SQL. See Create Dataset with SQL.
Changes in System Behavior
None.
Key Bug Fixes
Ticket | Description |
---|---|
TD-48245 | By default, under SSO, manual logout and session-expiration logout redirect to different pages: manual logout directs you to the SAML sign-out page, and session expiry produces a session-expired page. To redirect the user to a different URL on session expiry, an administrator can set the following parameter: |
New Known Issues
Ticket | Description |
---|---|
TD-48630 | Connection files used by the data service are not persisted in a Dockerized environment. Tip: In the Admin Settings page, set the data service configuration to use a persistent location: (Path-to-persistent-directory)/conf/data-service/application.properties |
TD-47696 | Platform appears to fail to restart properly through the Admin Settings page due to longer restarts of individual services. Tip: Restart can take several minutes. If the restart does not appear to complete, try reloading the page. If that does not work, restarting from the command line is more reliable. See Start and Stop the Platform. |
Release 6.8.1
February 7, 2020
This release introduces some new features and makes several relational connections generally available.
What's New
Install:
Support for CDH 6.3. See Supported Deployment Scenarios for Cloudera.
Note: Support for CDH 6.0 has been deprecated. See End of Life and Deprecated Features.
Import:
Upload tabular data from PDF documents.
Note: This feature is in Beta release.
Note: This feature must be enabled.
See Import PDF Data.
Read support for ORC tables managed through Hive. See Configure for Hive.
LDAP:
Support for initial binding to Active Directory using the user's account. See Configure SSO for AD-LDAP.
Cluster Clean:
The Cluster Clean standardization feature is now available in all product editions. See Overview of Cluster Clean.
Documentation:
API: Improved documentation for the asset transfer endpoint. See Changes to the APIs.
Changes in System Behavior
Wrangler Enterprise desktop application:
Note: In a future release, the Wrangler Enterprise desktop application will be deprecated. Please switch to a supported version of Google Chrome or Mozilla Firefox. Support for Edge Chromium is expected in a future release. See Browser Requirements.
General availability:
The following relational connections are now generally available:
DB2 (import only)
Salesforce (import only)
Tableau Server (publish only)
For more information, see Connection Types.
Key Bug Fixes
Ticket | Description |
---|---|
TD-45492 | Publishing to Databricks Tables fails on ADLS Gen1 in user mode. |
New Known Issues
Ticket | Description |
---|---|
TD-47263 | Importing an exported flow that references a Google Sheets or Excel source breaks the connection to the input source. Tip: If the importing user has access to the source, the user can re-import the dataset and then swap the source for the broken recipe. |
Release 6.8
December 6, 2019
Welcome to Release 6.8 of Designer Cloud Powered by Trifacta Enterprise Edition. This release introduces several key features around operationalizing the platform across the enterprise. Enterprise stakeholders can now receive email notifications when recurring jobs have succeeded or failed, updating data consumers outside of the platform. This release also introduces a generalized webhook interface, which facilitates push notifications to applications such as Slack when jobs have completed. When jobs fail, users can download a much richer support bundle containing configuration files, script files, and a specified set of log files.
Macros can now be exported and imported across environments. In support of this feature, the Wrangle Exchange is now available through the Alteryx Community, where you can download macros created by others and import them for your own use. As with macros, you can now export and import flows across product editions and releases (Release 6.8 or later only).
In the application, you can now use shortcut keys to navigate around the workspace and the Transformer page. And support for the Firefox browser has arrived. Read on for more goodness added with this release.
What's New
Install:
Support for ADLS Gen2 blob storage. See ADLS Gen2 Access.
Workspace:
Individual users can now enable or disable keyboard shortcuts in the workspace or Transformer page. See User Profile Page.
Configure locale settings at the workspace or user level. See Locale Settings.
You can optionally duplicate the datasets from a source flow when you create a copy of it. See Flow View Page.
Create a copy of your imported dataset. See Library Page.
Browser:
Support for Firefox browser.
Note: This feature is in Beta release.
For supported versions, see Browser Requirements.
Project Management:
Support for export and import of macros. See Macros Page.
For more information on macros, see Overview of Macros.
Download and use macros available through the Wrangle Exchange. See https://www.trifacta.com/blog/crowdsourcing-macros-trifacta-wrangle-exchange/.
Operationalization:
Create webhook notifications for third-party platforms based on the results of your job executions. See Create Flow Webhook Task. (A test-receiver sketch follows this list.)
Enable and configure email notifications based on the success or failure of job executions.
Note: This feature requires access to an SMTP server. See Enable SMTP Email Server Integration.
For more information on enabling, see Workspace Settings Page.
Individual users can opt out of receiving email messages or can configure use of a different email address. See Email Notifications Page.
For more information on enabling emails for individual flows, see Manage Flow Notifications Dialog.
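The webhook and email notification features above are configured in the application. To see exactly what a webhook task posts, a throwaway local receiver can help; the following is a minimal sketch of such a receiver, assuming nothing about the payload schema (the host and port are arbitrary test values).

```python
# Minimal local receiver for inspecting webhook notifications during testing.
# It makes no assumptions about the payload schema; it simply logs whatever
# the webhook task posts.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        raw = self.rfile.read(length)
        try:
            payload = json.loads(raw or b"{}")
        except json.JSONDecodeError:
            payload = {"raw": raw.decode("utf-8", errors="replace")}
        print("Webhook received:", json.dumps(payload, indent=2))
        self.send_response(200)  # acknowledge so the sender records success
        self.end_headers()

if __name__ == "__main__":
    # Point a webhook task at http://<this-host>:8080/ while testing.
    HTTPServer(("0.0.0.0", 8080), WebhookHandler).serve_forever()
```

Run it on a host the platform can reach, point a webhook task at it, and trigger a job; the logged body shows the notification payload your third-party integration will receive.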
Supportability:
The downloadable logs bundle for job success or failure now contains extensive configuration information to assist in debugging. For more information, see Configure Support Bundling.
Connectivity:
Support for integration with EMR 5.8 - 5.27. For more information, see Configure for EMR.
Connect to SFTP servers to read data and write datasets. See SFTP Connections.
Create connections to Databricks Tables.
Note: This connection is supported only when the Designer Cloud Powered by Trifacta platform is connected to an Azure Databricks cluster.
For more information, see Databricks Tables Connections.
Support for using a non-default database for your Snowflake stage.
Support for ingest from read-only Snowflake databases.
Import:
As of Release 6.8, you can import an exported flow into any edition or release whose build number is later than that of the export. See Import Flow.
Improved monitoring of long-loading relational sources. See Import Data Page.
Note: This feature must be enabled. See Configure JDBC Ingestion.
Transformer Page:
Select columns, functions applied to your source, and constants to replace your current dataset. See Select.
Improved Date/Time format selection. See Choose Datetime Format Dialog.
Tip: Datetime formats in card suggestions now factor in the user's locale settings for greater relevance.
Improved logic and performance when matching columns through Target Schema Mapping. Columns are now aligned based on the data they contain, in addition to their names (a conceptual sketch follows this list).
This feature is enabled by default. For more information, see Overview of Target Schema Mapping.
Improvements to the Search panel enable faster discovery of transformations, functions, and other objects. See Search Panel.
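The column-matching improvement above aligns columns by the data they contain as well as by name. The sketch below is a conceptual illustration only, not the platform's algorithm: it scores candidate column pairs by a weighted mix of name similarity and sample-value overlap, and every name in it is invented for the example.

```python
# Conceptual illustration of aligning source columns to target columns by
# combining name similarity with sample-value overlap. Not the platform's
# implementation; weights and scoring are arbitrary choices for the example.
from difflib import SequenceMatcher

def name_score(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def data_score(src_values, tgt_values):
    src, tgt = set(src_values), set(tgt_values)
    return len(src & tgt) / len(src | tgt) if src | tgt else 0.0

def align(source, target, w_name=0.5, w_data=0.5):
    """source/target: dict mapping column name -> list of sample values."""
    mapping = {}
    for s_col, s_vals in source.items():
        best_col, _ = max(
            target.items(),
            key=lambda t: w_name * name_score(s_col, t[0])
            + w_data * data_score(s_vals, t[1]),
        )
        mapping[s_col] = best_col
    return mapping

# "zip" aligns to "postal_code" on data overlap even though the names differ.
src = {"zip": ["94105", "10001"], "fname": ["Ann", "Bo"]}
tgt = {"postal_code": ["94105", "10001"], "first_name": ["Ann", "Bo"]}
print(align(src, tgt))  # {'zip': 'postal_code', 'fname': 'first_name'}
```

Matching on data as well as names is what lets differently named but equivalent columns, such as the zip/postal_code pair above, line up.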
Job execution:
By default, the Trifacta Application permits up to four jobs from the same flow to be executed at the same time. If needed, you can configure the application to execute jobs from the same flow one at a time. See Configure Application Limits.
If you enabled visual profiling for your job, you can download a JSON version of the visual profile. See Job Details Page.
Support for instance pooling in Azure Databricks. See Configure for Azure Databricks.
Language:
New trigonometry and statistical functions. See Changes to the Language.
API:
Apply overrides at the time of job execution via API (see the sketch after this list).
Define import mapping rules for your deployments that use relational sources or publish to relational targets.
Export and import macro definitions.
See Changes to the APIs.
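The job-execution override item above can be exercised with any HTTP client against the runJobGroup operation referenced later in these notes (https://api.trifacta.com/ee/9.7/index.html#operation/runJobGroup). The snippet below is a minimal sketch only: the base URL, token, and dataset id are placeholders, and the exact request fields ("wrangledDataset", "overrides", and their contents) should be verified against Changes to the APIs for your release.

```python
# Minimal sketch of triggering a job run with per-run overrides via the v4 API.
# Field names follow the runJobGroup operation linked in these notes, but treat
# them as assumptions to verify against your platform's API documentation.
import requests

BASE_URL = "https://example.trifacta.mycompany.com"  # placeholder
TOKEN = "<api-access-token>"                          # placeholder

def run_job_with_overrides(dataset_id, execution="photon", profiler=True):
    resp = requests.post(
        f"{BASE_URL}/v4/jobGroups",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "wrangledDataset": {"id": dataset_id},
            # Overrides apply only to this run.
            "overrides": {"execution": execution, "profiler": profiler},
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(run_job_with_overrides(dataset_id=123))
```

The macro export/import and deployment import-mapping endpoints use the same authentication pattern; see Changes to the APIs for their request schemas.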
Changes in System Behavior
Browser Support Policy:
For supported browsers, at the time of release, the latest stable version and the two previous stable versions are supported.
Note: Stable browser versions released after a given release of Designer Cloud Powered by Trifacta Enterprise Edition will NOT be supported for any prior version of Designer Cloud Powered by Trifacta Enterprise Edition. A best effort will be made to support newer versions released during the support lifecycle of the release.
For more information, see Browser Requirements.
Install:
Note: In the next release of Designer Cloud Powered by Trifacta Enterprise Edition after Release 6.8, support for installation on CentOS/RHEL 6.x and Ubuntu 14.04 will be deprecated. You should upgrade the Trifacta node to a supported version of CentOS/RHEL 7.x or Ubuntu 16.04. Before performing the upgrade, please perform a full backup of the Designer Cloud Powered by Trifacta platform and its databases. See Backup and Recovery.
Support for Spark 2.1 has been deprecated. Please upgrade to a supported version of Spark.
Support for EMR 5.6 and EMR 5.7 has also been deprecated. Please upgrade to a supported version of EMR.
For more information, see Product Support Matrix.
To simplify the installation distribution, only the Hadoop dependencies for the recommended version are included in the software download. Dependencies for other supported Hadoop distributions must be downloaded from the Alteryx FTP site and installed on the Trifacta node. See Install Hadoop Dependencies.
The Trifacta node has been upgraded to use Python 3. This instance of Python has no dependencies on any Python version external to the Trifacta node.
Import/Export:
Flows can now be exported and imported across products and versions of products. See Changes to the Object Model.
CLI and v3 endpoints (Release 6.4):
Note: Do not attempt to connect to the Designer Cloud Powered by Trifacta platform using any version of the CLI or the v3 endpoints. They are no longer supported and unlikely to work.
In Release 6.4:
The Command Line Interface (CLI) was deprecated. Customers must use the v4 endpoints for the APIs instead.
The v3 versions of the API endpoints were deprecated. Customers must use the v4 endpoints for the APIs instead.
Developer content was provided to assist in migrating to the v4 endpoints.
For more information on acquiring this content, please contact Alteryx Support.
Key Bug Fixes
Ticket | Description |
---|---|
TD-40348 | When loading a recipe in an imported flow that references an imported Excel dataset, Transformer page displays Input validation failed: (Cannot read property 'filter' of undefined) error, and the screen is blank. |
TD-42080 | Cannot run a flow or deployment that contains more than 10 recipe jobs. |
New Known Issues
Ticket | Description |
---|---|
TD-46123 | Cannot modify the type of relational target for a publishing action. Tip: Create a new publishing action with the desired relational target. Remove the original one if necessary. See Run Job Page. |
TD-45923 | Publishing a compressed Snappy file to SFTP fails. |
TD-45922 | You cannot publish TDE format to SFTP destinations. |
TD-45492 | Publishing to Databricks Tables fails on ADLS Gen1 in user mode. |
TD-45273 | Artifact Storage Service fails to start on HDP 3.1. Tip: The Artifact Storage Service can reference the HDP 2.6 Hadoop bundle JAR. |
TD-45122 | API: re-running a job using only the […] fails. Tip: Use a full […]. See https://api.trifacta.com/ee/9.7/index.html#operation/runJobGroup |
TD-44429 | Cannot publish outputs to relational targets, receiving a […] error. Tip: This issue may be caused by […]. If so, you can do either of the following: […] |
TD-44427 | Cannot publish a dataset containing duplicate rows to Teradata. Error message: Caused by: java.sql.SQLException: [Teradata Database] [TeraJDBC 15.10.00.14] [Error -2802] [SQLState 23000] Duplicate row error in abc_trifacta.tmp_218768523. Tip: This is a known limitation of Teradata. For more information on this limitation, see Enable Teradata Access. |