August 21, 2020
Support for PostgreSQL 12.3 for .
NOTE: For this release, PostgreSQL 12.3 is supported for supported versions of CentOS/RHEL 7 only. See Product Support Matrix.
NOTE: In a future release, support for PostgreSQL 9.6 will be deprecated. For more information, see Upgrade Databases for PostgreSQL.
Schema information is retained:
When schematized datasources are ingested, schema information is now retained for publication of job results.
NOTE: In prior releases, you may have set column data types manually because this schema information was lost during the ingest process. You may need to remove these manual steps from your recipe. For more information, see Improvements to the Type System.
If you are upgrading your cluster to CDH 6.3.3, please set the following property to the value listed below:
Save your changes and restart the platform. For more information, see Admin Settings Page.
For more information, see Configure for Spark.
TD-53062: After upgrade, imported recipe has UDF steps converted to comments.
On Azure Databricks, creating a stratified sample fails.
Cannot run Azure Databricks jobs on ADLS-Gen1 cluster in user mode.
UnknownHostException error when generating Azure Databricks access token from Secure Token Service.
Cannot import some Parquet files into the platform.
Import data page is taking too long to load.
Closing the connections search bar removes search bar and loses sort order.
On upgrade, Spark is incorrectly parsing files of type "UTF-8 Unicode (with BOM)."
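For background on the BOM issue above: files saved as "UTF-8 with BOM" begin with a byte-order mark, and a reader that decodes them as plain UTF-8 keeps the mark as the character U+FEFF, which then corrupts the first value. This is general decoding behavior, not the platform's internal fix; Python's `utf-8-sig` codec illustrates the difference:

```python
# A CSV payload saved as "UTF-8 with BOM": the first three bytes
# (EF BB BF) are the byte-order mark.
data = b"\xef\xbb\xbfname,age\nalice,30\n"

# Plain "utf-8" preserves the BOM as U+FEFF, polluting the first header.
plain = data.decode("utf-8")
# "utf-8-sig" strips a leading BOM if present.
clean = data.decode("utf-8-sig")

print(plain.split(",")[0])   # '\ufeffname' -- BOM corrupts the header
print(clean.split(",")[0])   # 'name'
```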
Import rules not working for remapping of WASB bucket name. For more information, see Define Import Mapping Rules.
Cannot import flow due to missing associated flownode error.
Server Save error when deleting a column.
Transformation engine unavailable due to prior crash.
After upgrade, you cannot edit recipes or run jobs on recipes that contain the optional
Optional file cleanup generates confusing error logging when it fails.
When modifying file privileges, the platform makes assumptions about database usernames.
On upgrade, the migration framework for the authorization service is too brittle for use with Amazon RDS database installations.
When flows are imported into the Deployment Manager, additional characters are inserted into parameterized output paths, causing job failures.
PostgreSQL connections may experience out of memory errors due to incorrectly specified fetch size and vendor configuration.
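For context on the fetch-size problem above: a database client that pulls an entire result set into memory at once can exhaust the heap on large queries, while fetching a bounded batch of rows at a time keeps memory flat. The sketch below shows the general streaming pattern against a DB-API-style cursor; the cursor interface is a standard one, but this is an illustration, not the platform's connector code:

```python
def stream_rows(cursor, batch_size=1000):
    """Yield rows in bounded batches via fetchmany() so the whole
    result set is never held in memory at once."""
    while True:
        batch = cursor.fetchmany(batch_size)
        if not batch:          # empty batch signals end of result set
            break
        for row in batch:
            yield row
```

With a driver such as psycopg2, a server-side (named) cursor is additionally required for the database to honor batched fetching rather than buffering the full result client-side.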
Can't import a flow that contains a reference in a flow webhook task to a deleted output.
Generic Hadoop folder is missing in
After upgrade, you cannot publish as a single-file to WASB to replace an existing output destination.
After upgrade, users cannot load recipes due to Requested Data Not Found error when loading samples.
After upgrading Cloudera cluster to version 6.3.3, you cannot run jobs due to the following error:
Please see "Cloudera support" above.
During upgrade, cross-migration fails for authorization service and its database with the following error:
After upgrade, ad-hoc publish to Hive fails.
After upgrade, you cannot unzip downloaded log files.
After upgrade, cross-migration validation fails for "groupsPolicies."
Tripache Vulnerabilities - CVE-2020-1927
May 4, 2020
Have a question about the product? Use the new in-app chat feature to explore content or ask our support staff a question. If you need assistance, please reach out!
NOTE: User messaging may require enablement in your deployment. See Enable In-App Chat.
NOTE: If you are installing or upgrading a deployment of that uses or will use a remote database service, such as Amazon RDS, for hosting the , please contact . For this release, additional configuration may be required.
Support for installation on CentOS/RHEL 8. See System Requirements.
NOTE: SSO using SAML is not supported on CentOS/RHEL 8. See Configure SSO for SAML.
NOTE: Support for CentOS/RHEL 6 has been deprecated. Please upgrade to CentOS/RHEL 8.
Support for EMR 5.28.1 and EMR 5.29.0
NOTE: EMR 5.28.0 is not supported, due to Spark compatibility issues.
NOTE: Support for EMR 5.8 - EMR 5.12 is deprecated. For more information, see End of Life and Deprecated Features.
Support for installation on Ubuntu 18.04 (Bionic Beaver). See System Requirements.
NOTE: Support for installation on Ubuntu 14.04 (Trusty) has been deprecated. See End of Life and Deprecated Features.
Improved performance for Oracle, SQL Server, and DB2 connections. These performance improvements will be applied to other relational connections in future releases.
NOTE: For more information on enabling this feature, please contact .
NOTE: To enable these additional read/write capabilities through Databricks Tables, the underlying connection was changed to use a Simba driver. In your connection definition, any Connect String Options that relied on the old Hive driver may not work. For more information, see Configure for Azure Databricks.
Introducing plans. A plan is a sequence of tasks on one or more flows that can be scheduled.
NOTE: In this release, the only type of task that is supported is Run Flow.
Create flow parameters that you can reference in your flow. Flow parameters can be string literals, , or regular expression patterns.
NOTE: For this release, flow parameters can be applied into your recipes only.
As needed, you can apply overrides to the parameters in your flow or to downstream flows.
NOTE: Flow parameters do not apply to datasets or output objects, which have their own parameters. However, if you specify an override at the flow level, any parameters within the flow that use the same name receive the override value, including output object parameters and datasets with parameters.
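The name-matching override rule described in the note above can be sketched generically: a flow-level override wins for every parameter that shares its name, and all other parameters keep their defaults. This is an illustration of the merge rule only, not the product's implementation:

```python
def resolve_parameters(flow_defaults, overrides):
    """Apply flow-level overrides: any parameter whose name appears in
    overrides takes the override value; the rest keep their defaults."""
    return {name: overrides.get(name, default)
            for name, default in flow_defaults.items()}

defaults = {"region": "us-east", "env": "dev"}
print(resolve_parameters(defaults, {"env": "prod"}))
# {'region': 'us-east', 'env': 'prod'}
```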
NOTE: Azure Databricks 5.3 and 5.4 are no longer supported. Please upgrade to Azure Databricks 5.5 LTS or 6.x. See End of Life and Deprecated Features.
Support for generating results and publishing to Tableau Hyper format.
NOTE: Tableau TDE format will be deprecated in a future release. Please switch to using Tableau Hyper format.
If you have upgraded to Tableau Server 10.5 or later, you may have a mix of TDE and Hyper files stored on the server. You can automatically upgrade the TDE files to Hyper, if needed. For more information, see https://help.tableau.com/current/online/en-us/extracting_upgrade.htm.
The is no longer available in the software distribution and has been deprecated. Please switch to a supported browser version. For more information, see Desktop Requirements.
A Release 6.8 version of the can be made available upon request. For more information, please contact .
All workspace admins now have access to all user-created objects within the workspace.
NOTE: Workspace administrators can access some types of user-created objects in the workspace with the same level of access as the object owner. Under some conditions, workspace admins may have access to source datasets and generated results. See Workspace Admin Permissions.
API reference documentation is now available directly through the application. This release includes more supported endpoints and documented options. To access, select Help menu > API Documentation.
NOTE: API reference content is no longer available with the product documentation. Please use the in-app reference documentation instead.
Workflow documentation is still available with the product documentation. For more information, see API Reference.
Upgrade to NodeJS 12.16.1.
NOTE: This dependency is specific to the . For this release, a separate installation of is required for installing or upgrading the platform.
See Install on Ubuntu.
The format of the supported WASB URIs has changed.
NOTE: If you were using the APIs to interact with WASB resources, you must update your resources to use the new format. See Changes to the APIs.
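As documented by Azure (independent of this product), WASB URIs take the form `wasbs://<container>@<account>.blob.core.windows.net/<path>`. A quick way to sanity-check resources against that shape before updating them, sketched with the standard library:

```python
from urllib.parse import urlparse

def parse_wasb_uri(uri):
    """Split a wasb(s):// URI into (container, account, path).
    Expected shape: wasbs://<container>@<account>.blob.core.windows.net/<path>
    """
    parts = urlparse(uri)
    if parts.scheme not in ("wasb", "wasbs"):
        raise ValueError("not a WASB URI: " + uri)
    container, _, host = parts.netloc.partition("@")
    account = host.split(".")[0]
    return container, account, parts.path.lstrip("/")

print(parse_wasb_uri("wasbs://data@myacct.blob.core.windows.net/in/file.csv"))
# ('data', 'myacct', 'in/file.csv')
```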
In a future release, custom dictionaries that rely on an uploaded file will be deprecated. The specific release vehicle has not been determined yet.
WASB and ADLS:
Configuration to enable WASB and ADLS access has been streamlined and simplified.
NOTE: No action is required for upgrading customers.
The default port number for the secure token service has been changed from 8090. The new default port number is
NOTE: Your upgraded installation is forced to use this new port number. You can modify the value after installation or upgrade.
By default under SSO, manual logout and session expiration redirect to different pages. Manual logout directs you to the SAML sign-out page, while session expiry displays a session-expired page.
To redirect the user to a different URL on session expiry, an administrator can set the following parameter:
You cannot update your AWS configuration for per-user or per-workspace mode via UI.
Cannot select and apply custom data types through column Type menu.
TD-47784: When creating custom datasets using SQL from Teradata sources, the
Uploaded files (CSV, XLS, PDF) that contain a space in the filename fail to be converted.