Release 6.8.2
...
April 27, 2020
What's New
- Enhanced full-screen interface for importing using custom SQL. See Create Dataset with SQL.
Changes in System Behavior
None.
Key Bug Fixes
...
By default, under SSO, manual logout and session expiration redirect to different pages: manual logout directs you to SAML sign-out, while session expiry produces a session-expired page.
To redirect the user to a different URL on session expiry, an administrator can set the following parameter: webapp.session.redirectUriOnExpiry. This parameter applies to the following SSO environments:
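As an illustration, the parameter can be written into a Java-style configuration file. The sketch below is a minimal helper for that kind of edit; only the parameter name comes from these notes, while the file contents, location, and redirect URL are assumptions:

```python
def set_property(text: str, key: str, value: str) -> str:
    """Replace key=value in Java-style .properties content, or append it
    if the key is not present yet."""
    lines, found = [], False
    for line in text.splitlines():
        if line.split("=", 1)[0].strip() == key:
            lines.append(f"{key}={value}")
            found = True
        else:
            lines.append(line)
    if not found:
        lines.append(f"{key}={value}")
    return "\n".join(lines) + "\n"

# Hypothetical usage; the real configuration file and its location
# depend on your installation:
updated = set_property(
    "webapp.session.durationInMins=480",        # assumed existing content
    "webapp.session.redirectUriOnExpiry",       # parameter from these notes
    "https://sso.example.com/session-expired",  # illustrative URL
)
```
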
New Known Issues
...
Connection files used by the data service are not persisted in a Dockerized environment.
Tip: Workaround: In the Admin Settings page, set the following:

(Path-to-persistent-directory)/conf/data-service/application.properties
...
The platform may appear to fail to restart properly through the Admin Settings page because individual services can take longer to restart. Symptoms:
- Changes to settings may appear to have not been applied.
- Admin Settings page appears to be stuck restarting.
Tip: Workaround: Restart can take up to several minutes. If the restart does not appear to complete, try reloading the page. If that doesn't work, restarting from the command line is more reliable. See Start and Stop the Platform.
Release 6.8.1
February 7, 2020
This release enables some new features and makes some relational connections generally available.
What's New
Install:
...
NOTE: Support for CDH 6.0 has been deprecated. See End of Life and Deprecated Features.
Import:
...
NOTE: This is a Beta feature and must be enabled.
...
...
LDAP:
- Support for initial binding to Active Directory using the user's account. See Configure SSO for AD-LDAP.
Cluster Clean:
- Cluster Clean standardization feature is now available in all product editions. See Overview of Cluster Clean.
...
- API: Improved documentation for the asset transfer endpoint. See Changes to the APIs.
Changes in System Behavior
NOTE: In a future release, the desktop application ...
General availability:
- The following relational connections are now generally available:
- DB2 (import only)
- Salesforce (import only)
- Tableau Server (publish only)
For more information, see Connection Types.
Key Bug Fixes
...
Publishing to Databricks Tables fails on ADLS Gen1 in user mode.
New Known Issues
...
Importing an exported flow that references a Google Sheets or Excel source breaks connection to input source.
Tip: Workaround: If the importing user has access to the source, the user can re-import the dataset and then swap the source for the broken recipe.
Release 6.8
December 6, 2019
Welcome to Release 6.8 of the platform.
...
In the application, you can now use shortcut keys to navigate around the workspace and the Transformer page. And support for the Firefox browser has arrived. Read on for more goodness added with this release.
What's New
Install:
- Support for ADLS Gen2 blob storage. See Enable ADLS Gen2 Access.
Workspace:
- Individual users can now enable or disable keyboard shortcuts in the workspace or Transformer page. See User Profile Page.
- Configure locale settings at the workspace or user level. See Locale Settings.
- You can optionally duplicate the datasets from a source flow when you create a copy of it. See Flow View Page.
- Create a copy of your imported dataset. See Library Page.
...
- Create webhook notifications for third-party platforms based on results of your job executions. See Create Flow Webhook Task.
- Enable and configure email notifications based on the success or failure of job executions.
NOTE: This feature requires access to an SMTP server. See Enable SMTP Email Server Integration.
- For more information on enabling, see Workspace Admin Page.
- Individual users can opt out of receiving email messages or can configure use of a different email address. See Email Notifications Page.
- For more information on enabling emails for individual flows, see Manage Flow Notifications Dialog.
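A webhook task simply delivers a small JSON message about a job outcome to a third-party URL via HTTP POST. The sketch below illustrates the mechanism only; the payload field names and endpoint are assumptions, not the product's actual webhook schema:

```python
import json
import urllib.request

def build_webhook_request(url: str, job_id: int, status: str) -> urllib.request.Request:
    """Build an HTTP POST carrying a small JSON payload about a job outcome.
    The payload fields here are illustrative, not the product's schema."""
    body = json.dumps({"jobId": job_id, "status": status}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={"Content-Type": "application/json"},
    )

# Hypothetical third-party endpoint:
req = build_webhook_request("https://hooks.example.com/jobs", 1234, "Complete")
# urllib.request.urlopen(req)  # would deliver the notification
```
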
...
- The downloaded logs bundle on job success or failure now contains extensive configuration information to assist in debugging. For more information, see Configure Support Bundling.
...
- Support for integration with EMR 5.8 - 5.27. For more information, see Configure for EMR.
- Connect to SFTP servers to read data and write datasets. See Create SFTP Connections.
...
- Support for using a non-default database for your Snowflake stage.
- Support for ingest from read-only Snowflake databases.
- See Enable Snowflake Connections.
...
- Select columns, functions applied to your source, and constants to replace your current dataset. See Select.
- Improved Date/Time format selection. See Choose Datetime Format Dialog.
Tip: Datetime formats in card suggestions now factor in the user's locale settings for greater relevance.
- Improved matching logic and performance when matching columns through RapidTarget.
- Align columns based on the data contained in them, in addition to the column name.
- This feature is enabled by default. For more information, see Overview of RapidTarget.
- Improvements to the Search panel enable faster discovery of transformations, functions, and other objects. See Search Panel.
Job execution:
- By default, the application permits up to four jobs from the same flow to be executed at the same time. If needed, you can configure the application to execute jobs from the same flow one at a time. See Configure Application Limits.
- If you enabled visual profiling for your job, you can download a JSON version of the visual profile. See Job Details Page.
- Support for instance pooling in Azure Databricks. See Configure for Azure Databricks.
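The per-flow concurrency limit described above behaves like a counting semaphore: up to four jobs from one flow may hold a slot at once, and further jobs must wait. A minimal sketch of the idea, where only the default limit of four comes from these notes and everything else is illustrative:

```python
import threading

class FlowJobLimiter:
    """Allow at most `limit` jobs from the same flow to run concurrently.
    This is an illustrative model of the behavior, not product code."""

    def __init__(self, limit: int = 4):
        self._sems = {}                 # one semaphore per flow id
        self._lock = threading.Lock()   # guards the dict itself
        self._limit = limit

    def _sem(self, flow_id: str) -> threading.Semaphore:
        with self._lock:
            return self._sems.setdefault(flow_id, threading.Semaphore(self._limit))

    def try_start(self, flow_id: str) -> bool:
        """Non-blocking: True if a job from this flow may start now."""
        return self._sem(flow_id).acquire(blocking=False)

    def finish(self, flow_id: str) -> None:
        """Release the slot when the job completes."""
        self._sem(flow_id).release()
```

Setting the limit to 1 models the optional one-at-a-time configuration mentioned above.
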
Language:
- New trigonometry and statistical functions. See Changes to the Language.
API:
- Apply overrides at time of job execution via API.
- Define import mapping rules for your deployments that use relational sources or publish to relational targets.
- Export and import macro definitions.
- See Changes to the APIs.
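For example, a run-job request with execution-time overrides might be assembled as below. This is a sketch only: it assumes a v4-style jobGroups request body, and the override key names shown are illustrative; check Changes to the APIs for the actual schema.

```python
import json
from typing import Optional

def jobgroup_request_body(recipe_id: int, overrides: Optional[dict] = None) -> str:
    """Assemble a JSON body for a run-job request. Optional execution-time
    overrides are attached under an "overrides" key (assumed shape)."""
    body = {"wrangledDataset": {"id": recipe_id}}
    if overrides:
        body["overrides"] = overrides
    return json.dumps(body)

# Hypothetical call; the override keys are assumptions:
body = jobgroup_request_body(28629, overrides={"execution": "photon", "profiler": True})
```
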
Changes in System Behavior
Browser Support Policy:
For supported browsers, at the time of release, the latest stable version and the two previous stable versions are supported.
NOTE: Stable browser versions released after a given release of the platform will NOT be supported for any prior version of the platform. A best effort will be made to support newer versions released during the support lifecycle of the release.
For more information, see Desktop Requirements.
Install:
NOTE: In the next release of the platform, ...
- Support for Spark 2.1 has been deprecated. Please upgrade to a supported version of Spark.
- Support for EMR 5.6 and EMR 5.7 has also been deprecated. Please upgrade to a supported version of EMR.
- For more information, see Product Support Matrix.
- To simplify the installation distribution, the Hadoop dependencies for the recommended version only are included in the software download. For the dependencies for other supported Hadoop distributions, you must download them from the FTP site and install them on the node. See Install Hadoop Dependencies.
- The node has been upgraded to use Python 3. This instance of Python has no dependencies on any Python version external to the node.
...
NOTE: Do not attempt to connect to the ...
In Release 6.4:
- The Command Line Interface (CLI) was deprecated. Customers must use the v4 API endpoints instead.
- The v3 versions of the API endpoints were deprecated. Customers must use the v4 API endpoints instead.
- Developer content was provided to assist in migrating to the v4 API endpoints.
- For more information on acquiring this content, please contact support.
Key Bug Fixes
Ticket | Description |
---|---|
TD-40348 | When loading a recipe in an imported flow that references an imported Excel dataset, Transformer page displays Input validation failed: (Cannot read property 'filter' of undefined) error, and the screen is blank. |
TD-42080 | Cannot run flow or deployment that contains more than 10 recipe jobs. |
New Known Issues
Ticket | Description |
---|---|
TD-46123 | Publishing a compressed Snappy file to SFTP fails. Workaround: Create a new publishing action with the desired relational target. Remove the original one if necessary. See Run Job Page. |
TD-45922 | You cannot publish TDE format to SFTP destinations. |
TD-45492 | Publishing to Databricks Tables fails on ADLS Gen1 in user mode. |
TD-45273 | Artifact Storage Service fails to start on HDP 3.1. Steps: ... |
TD-45122 | API: re-running job using only the ... |
TD-44429 | Cannot publish outputs to relational targets, receiving ... |
TD-44427 | Cannot publish dataset containing duplicate rows to Teradata. Error message: ... |
...