Release Notes 7.1
Release 7.1.2
November 25, 2020
What's New
This release provides fixes to key issues.
Publishing:
Improved performance when publishing to Tableau Server.
Configure publishing chunk sizes as needed. For more information, see Configure Data Service.
Changes in System Behavior
None.
Key Bug Fixes
Ticket | Description |
---|---|
TD-55714 | Receiving a 502 error when attempting to use a tested SFTP server connection. Note In this case, the issue relates to how the batch job runner service authenticates to the SFTP server. For more information on configuration options, see SFTP Connections. |
TD-55125 | Cannot copy flow. However, export and import of the flow enables copying. |
TD-53475 | |
TD-52737 | A newly created SFTP connection displays a blank page when opened from the Import Data page. |
New Known Issues
None.
Release 7.1.1
August 21, 2020
What's New
Support for PostgreSQL 12.3 for Alteryx databases.
Note
For this release, PostgreSQL 12.3 is available on supported versions of CentOS/RHEL 7 only. See Product Support Matrix.
Note
In a future release, support for PostgreSQL 9.6 will be deprecated. For more information, see Upgrade Databases for PostgreSQL.
Azure Databricks:
Support for configurable Azure AD endpoint and authority for SSO validation. For more information, see Configure SSO for Azure AD.
Changes in System Behavior
Cloudera support:
If you are upgrading your cluster to CDH 6.3.3, please set the following property to the value listed below (a configuration sketch appears after these steps):
"spark.version": "2.4.cdh6.3.3.plus",
Save your changes and restart the platform. For more information, see Admin Settings Page.
For more information, see Configure for Spark.
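For reference, the entry as it might appear in the platform configuration (for example, in trifacta-conf.json; the surrounding entries shown here are elided placeholders):
{
  ...
  "spark.version": "2.4.cdh6.3.3.plus",
  ...
}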
Key Bug Fixes
Ticket | Description |
---|---|
TD-53062 | After upgrade, imported recipe has UDF steps converted to comments. |
TD-52738 | On Azure Databricks, creating a stratified sample fails. |
TD-52686 | Cannot run Azure Databricks jobs on ADLS-Gen1 cluster in user mode. |
TD-52614 | UnknownHostException error when generating Azure Databricks access token from Secure Token Service. |
TD-51903 | Cannot import some Parquet files into the platform. |
TD-51681 | Import Data page takes too long to load. |
TD-51537 | Closing the connections search bar removes the search bar and loses the sort order. |
TD-51306 | On upgrade, Spark is incorrectly parsing files of type "UTF-8 Unicode (with BOM)." |
TD-51218 | Import rules not working for remapping of WASB bucket name. For more information, see API Task - Define Deployment Import Mappings. |
TD-51166 | Cannot import flow due to missing associated flownode error. |
TD-50945 | Server Save error when deleting a column. |
TD-50906 | Transformation engine unavailable due to prior crash. |
TD-50791 | After upgrade, you cannot edit recipes or run jobs on recipes that contain the optional |
TD-50703 | Optional file cleanup generates confusing error logging when it fails. |
TD-50642 | When modifying file privileges, the platform makes assumptions about database usernames. |
TD-50530 | On upgrade, the migration framework for the authorization service is too brittle for use with Amazon RDS database installations. |
TD-50525 | When flows are imported into the Deployment Manager, additional characters are inserted into parameterized output paths, causing job failures. |
TD-50522 | PostgreSQL connections may experience out of memory errors due to incorrectly specified fetch size and vendor configuration. |
TD-50516 | Cannot import a flow that contains a flow webhook task referencing a deleted output. |
TD-50508 | Generic Hadoop folder is missing in |
TD-50496 | After upgrade, you cannot publish as a single-file to WASB to replace an existing output destination. |
TD-50495 | After upgrade, users cannot load recipes due to Requested Data Not Found error when loading samples. |
TD-50466 | After upgrading the Cloudera cluster to version 6.3.3, you cannot run jobs due to the following error: class not found exception: java.lang.NoClassDefFoundError: org/apache/spark/sql/execution/datasources/csv/CSVOptions. Please see "Cloudera support" above. |
TD-50446 | During upgrade, cross-migration fails for authorization service and its database with the following error: Cross migration failed. Make sure the authorization DB is reset. |
TD-50164 | After upgrade, ad-hoc publish to Hive fails. |
TD-49991 | After upgrade, you cannot unzip downloaded log files. |
TD-49973 | After upgrade, cross-migration validation fails for "groupsPolicies." |
TD-49692 | Tripache Vulnerabilities - CVE-2020-1927 |
New Known Issues
Ticket | Description |
---|---|
TD-59854 | Datetime column from a Parquet file is inferred as the wrong data type on import. Tip Use the column drop-down to change the data type to Datetime. |
TD-51229 | When an admin user shares a flow that the admin user owns, a |
Release 7.1
May 4, 2020
What's New
In-app chat:
Tip
Use the new in-app chat feature to explore content or ask a question to our support staff. If you need assistance, please reach out!
Troubleshooting:
Users can download log files related to their current session through the application. See Download Logs Dialog.
Administrators have a separate admin dialog that enables log download by time frame, job identifier, or session identifier. See Admin Download Logs Dialog.
Install:
Note
If you are installing or upgrading a deployment of Designer Cloud Powered by Trifacta Enterprise Edition that uses or will use a remote database service, such as Amazon RDS, for hosting the Alteryx databases, please contact Alteryx Customer Success and Services. For this release, additional configuration may be required.
Support for installation on CentOS/RHEL 8. See System Requirements.
Note
SSO using SAML is not supported on CentOS/RHEL 8. See Configure SSO for SAML.
Note
Support for CentOS/RHEL 6 has been deprecated. Please upgrade to CentOS/RHEL 8.
Support for installation on CentOS/RHEL 7.7. See System Requirements.
Support for EMR 5.28.1 and EMR 5.29.0.
Note
EMR 5.28.0 is not supported, due to Spark compatibility issues.
Note
Support for EMR 5.8 - EMR 5.12 is deprecated. For more information, see End of Life and Deprecated Features.
Support for Azure Databricks 6.2. See Configure for Azure Databricks.
Support for installation on Ubuntu 18.04 (Bionic Beaver). See System Requirements.
Note
Support for installation on Ubuntu 14.04 (Trusty) has been deprecated. See End of Life and Deprecated Features.
Support for CDH 6.0 is deprecated. See End of Life and Deprecated Features.
Spark:
Support for Spark 2.2.x versions is deprecated. See End of Life and Deprecated Features.
Improved performance for Spark profiling on Datetime and numeric columns with a low number of discrete values.
Kerberos:
Support for access to Kerberized clusters. See Configure for EMR.
Connectivity:
Improved performance for Oracle, SQL Server, and DB2 connections. These performance improvements will be applied to other relational connections in future releases.
Note
For more information on enabling this feature, please contact Alteryx Customer Success and Services.
Azure Databricks Tables:
Support for read/write on Delta tables.
Support for read/write on external tables.
Support for read from partitioned tables.
Note
To enable these additional read/write capabilities through Databricks Tables, the underlying connection was changed to use a Simba driver. In your connection definition, any Connect String Options that relied on the old Hive driver may not work. For more information, see Configure for Azure Databricks.
Import:
Ingestion of large relational datasets is no longer a blocking operation. For more information, see Configure JDBC Ingestion.
Track progress of large-scale ingestion in Flow View and the Library page.
See Flow View Page.
See Import Data Page.
Workspace:
Redesigned Settings and Resources menus. See Home Page.
User settings are now modified through Preferences. See Preferences Page.
Administrators now have a dedicated admin area. See Admin Console.
Plans:
Introducing plans. A plan is a sequence of tasks on one or more flows that can be scheduled.
Note
In this release, the only type of task that is supported is Run Flow.
For more information on plans, see Plans Page.
For more information on orchestration in general, see Overview of Operationalization.
Flow View:
Introducing new Flow View. The Flow View page has been redesigned to improve the user experience and overall productivity.
Note
This feature is in Beta release.
Enhancements include:
Drag and drop to reposition objects on the Flow View canvas, and zoom in and out to focus on areas of development.
Perform joins and unions between objects on the Flow View canvas.
Annotate the canvas with notes.
You can toggle between new and classic views through the context menu in the corner of Flow View. See Flow View Page.
As needed, Alteryx administrators can disable access to the new Flow View completely. See Miscellaneous Configuration.
Create flow parameters that you can reference in your flow. Flow parameters can be string literals, Alteryx patterns, or regular expression patterns.
Note
For this release, flow parameters can be applied within your recipes only.
As needed, you can apply overrides to the parameters in your flow or to downstream flows.
Note
Flow parameters do not apply to datasets or output objects, which have their own parameters. However, if you specify an override at the flow level, any parameters within the flow that use the same name receive the override value, including output object parameters and datasets with parameters.
For more information on parameters, see Overview of Parameterization.
Monitor job progress through each phase in the Jobs panel. See Flow View Page.
Transformer Page:
Improved performance when loading the Transformer page and when navigating between the Flow View and Transformer pages.
Join steps are now created in a larger window for more workspace. See Join Window.
New column selection UI simplifies choosing columns in your transformations. See Transform Builder.
Faster and improved method of surfacing transform suggestions based on machine learning.
Job Execution:
Note
Azure Databricks 5.3 and 5.4 are no longer supported. Please upgrade to Azure Databricks 5.5 LTS or 6.x. See End of Life and Deprecated Features.
Apply overrides to Spark properties for individual job execution. See Enable Spark Job Overrides.
Execute jobs from SFTP sources on EMR and Azure Databricks. See SFTP Connections.
Job Details:
When visual profiling is enabled for a job, you can now download your visual profile in PDF format. See Job Details Page.
Publishing:
Support for generating results and publishing to Tableau Hyper format.
Note
Tableau TDE format will be deprecated in a future release. Please switch to using Tableau Hyper format.
If you have upgraded to Tableau Server 10.5 or later, you may have a mix of TDE and Hyper files stored on the server. You can automatically upgrade the TDE files to Hyper, if needed. For more information, see https://help.tableau.com/current/online/en-us/extracting_upgrade.htm.
If you are on Tableau Server 10.5 or later and you append to a TDE file, the file is automatically converted to Hyper format. This conversion cannot be reverted.
Language:
New functions to parse values against specific data types.
New functions for calculating working days between two valid dates.
New two-column statistical functions.
Documentation:
New content on getting started with sampling. See Sampling Basics.
Feature overview: Overview of Sampling
Best practices: https://community.trifacta.com/s/article/Best-Practices-Managing-Samples-in-Complex-Flows
Changes in System Behavior
Wrangler Enterprise desktop application:
Warning
The Wrangler Enterprise desktop application is no longer available in the software distribution and has been deprecated. Please switch to a supported browser version. For more information, see Browser Requirements.
A Release 6.8 version of the Wrangler Enterprise desktop application can be made available upon request. For more information, please contact Alteryx Support.
Authorization:
All Alteryx admin users are now workspace admins.
All workspace admins now have access to all user-created objects within the workspace.
Note
Workspace administrators can access some types of user-created objects in the workspace with the same level of access as the object owner. Under some conditions, workspace admins may have access to source datasets and generated results. See Workspace Admin Permissions.
For more information, see Changes to User Management.
API Documentation:
API reference documentation is now available directly through the application. This release includes more supported endpoints and documented options. To access, select Resources menu > API documentation.
Note
API reference content is no longer available with the product documentation. Please use the in-app reference documentation instead.
Task documentation is still available with the product documentation. For more information, see API Reference.
For details, see Changes to the APIs.
Trifacta node:
Upgrade to NodeJS 12.16.1.
Note
This dependency is specific to the Designer Cloud Powered by Trifacta platform. For this release, a separate installation of Alteryx dependencies is required for installing or upgrading the platform.
See Install on CentOS and RHEL.
See Install on Ubuntu.
See System Requirements.
See System Dependencies.
APIs:
The v3 API endpoints are no longer available in the platform. You must use the v4 endpoints (see the example after this list). See API Reference.
Simplified connections endpoints.
The format of the supported WASB URIs has changed.
Note
If you were using the APIs to interact with WASB resources, you must update your resources to use the new format. See Changes to the APIs.
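To illustrate the v4 endpoints, the following is a minimal sketch of running a job by posting to the jobGroups endpoint; the dataset identifier is hypothetical, and the full set of request options is documented in the in-app API reference:
POST /v4/jobGroups
Authorization: Bearer <token>
Content-Type: application/json
{
  "wrangledDataset": {
    "id": 28629
  }
}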
Custom dictionaries:
In a future release, custom dictionaries that rely on an uploaded file will be deprecated. The specific release vehicle has not been determined yet.
Deprecation only affects the ability to create custom types using a file. Where possible, you can and should continue to create custom types using regular expressions. For more information, see Create Custom Data Types Using RegEx.
The file-based feature will be replaced by a standardization-based option.
Beginning in this release, this feature is disabled by default.
Parameter overrides:
If you have upgraded to Release 7.1 or later, any parameter overrides that you have specified in your flows can be modified in the Overrides tab of the Manage Parameters dialog.
For more information, see Manage Parameters Dialog.
WASB and ADLS:
Configuration to enable WASB and ADLS access has been streamlined and simplified.
Note
No action is required for upgrading customers.
See WASB Access.
See ADLS Gen1 Access.
Secure Token Service:
The default port number for the secure token service has been changed from 8090 to 41921.
Note
Your upgraded installation is forced to use this new port number. You can modify the value after installation or upgrade.
Sharing:
The Send a Copy feature is no longer available in the product. Instead, you can make a copy of the flow and share it. See Flow View Page.
Language:
All MODE functions return the lowest value in a set of values if there is a tie in the evaluation. See Changes to the Language.
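For example, in the set [2, 2, 5, 5], the values 2 and 5 each occur twice; MODE returns 2, the lowest of the tied values.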
Key Bug Fixes
Ticket | Description |
---|---|
TD-48245 | By default, under SSO, manual logout and session expiration redirect to different pages: manual logout directs you to SAML sign-out, and session expiry produces a session expired page. To redirect the user to a different URL on session expiry, an administrator can set the following parameter: |
New Known Issues
Ticket | Description |
---|---|
TD-52221 | You cannot update your AWS configuration for per-user or per-workspace mode via the UI. Tip You can switch to AWS system mode, with a single, system-wide configuration, or you can use the APIs to make changes. See API Task - Manage AWS Configurations. |
TD-49559 | Cannot select and apply custom data types through the column Type menu. Tip You can change the type of the column as a recipe step. Use the Change column type transformation. From the New type drop-down, select |
TD-47784 | When creating custom datasets using SQL from Teradata sources, the |
TD-47473 | Uploaded files (CSV, XLS, PDF) that contain a space in the filename fail to be converted. Tip Remove the space in the filename and upload again. |