December 6, 2019
Welcome to Release 6.8 of the platform. This release introduces several key features for operationalizing the platform across the enterprise. Enterprise stakeholders can now receive email notifications when recurring jobs succeed or fail, keeping data consumers outside of the platform informed. This release also introduces a generalized webhook interface, which enables push notifications to applications such as Slack when jobs have completed. When jobs fail, users can download a much richer support bundle containing configuration files, script files, and a specified set of log files.
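As an illustration of the kind of integration the new webhook interface enables, the sketch below builds and posts a Slack-style incoming-webhook message announcing a job result. The payload shape (a JSON object with a `"text"` key) follows Slack's incoming-webhook convention; the job fields and the function names are illustrative, not part of the product's API.

```python
import json
from urllib import request

def build_job_payload(job_id, status):
    """Build a Slack incoming-webhook payload announcing a job result.

    The {"text": ...} shape is Slack's incoming-webhook convention;
    the job_id/status fields here are illustrative only.
    """
    return {"text": f"Job {job_id} finished with status: {status}"}

def post_webhook(url, payload):
    """POST the JSON payload to the given webhook URL."""
    req = request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return resp.status

payload = build_job_payload("job-1234", "Complete")
# post_webhook("https://hooks.slack.com/services/...", payload)  # real URL required
```

The POST call is commented out because it requires a live webhook URL; the payload construction alone shows the message a receiving application would see.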
Macros can now be exported and imported across environments. In support of this feature, the Wrangle Exchange is now available, where you can download macros created by others and import them for your own use. Like macros, you can now export and import flows across product editions and releases (Release 6.8 or later only).
In the application, you can now use shortcut keys to navigate the workspace and the Transformer page. Support for the Firefox browser has also arrived. Read on for more goodness added with this release.
Install:
Workspace:
Browser:
Project Management:
Operationalization:
Enable and configure email notifications based on the success or failure of job executions.
NOTE: This feature requires access to an SMTP server. See Enable SMTP Email Server Integration.
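To illustrate what an SMTP-backed notification involves, here is a minimal sketch using Python's standard `smtplib` and `email` modules. The sender address, SMTP host, and message fields are hypothetical; the platform's actual notification content and configuration are managed through its own settings.

```python
import smtplib
from email.message import EmailMessage

def build_notification(job_id, status, recipient):
    """Build a plain-text job-status email (illustrative fields only)."""
    msg = EmailMessage()
    msg["Subject"] = f"Job {job_id}: {status}"
    msg["From"] = "noreply@example.com"  # hypothetical sender address
    msg["To"] = recipient
    msg.set_content(f"Job {job_id} finished with status {status}.")
    return msg

msg = build_notification("42", "Success", "stakeholder@example.com")
# Sending requires a reachable SMTP server, e.g.:
# with smtplib.SMTP("smtp.example.com", 587) as s:
#     s.starttls()
#     s.login(user, password)
#     s.send_message(msg)
```

The send step is commented out since it depends on the SMTP server referenced in the NOTE above.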
Supportability:
Connectivity:
Create connections to Databricks Tables.
NOTE: This connection is supported only when the
For more information, see Create Databricks Tables Connections.
Import:
As of Release 6.8, you can import an exported flow into any edition or release after the build number of the export. See Import Flow.
Improved monitoring of long-loading relational sources. See Import Data Page.
NOTE: This feature must be enabled. See Configure JDBC Ingestion.
Transformer Page:
Improved Date/Time format selection. See Choose Datetime Format Dialog.
Tip: Datetime formats in card suggestions now factor in the user's locale settings for greater relevance.
Job execution:
If you enabled visual profiling for your job, you can download a JSON version of the visual profile. See Job Details Page.
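Since the visual profile can now be downloaded as JSON, it can be consumed programmatically. The sketch below parses a profile and computes per-column validity ratios. The schema shown is hypothetical; the actual structure of the exported profile is defined by the product.

```python
import json

# Hypothetical profile excerpt: the real schema of the downloaded
# visual profile is product-defined; these keys are illustrative only.
sample = """
{
  "columns": [
    {"name": "id", "validValueCount": 98, "invalidValueCount": 2},
    {"name": "email", "validValueCount": 90, "invalidValueCount": 10}
  ]
}
"""

profile = json.loads(sample)
summary = {
    col["name"]: col["validValueCount"]
    / (col["validValueCount"] + col["invalidValueCount"])
    for col in profile["columns"]
}
print(summary)  # e.g. {'id': 0.98, 'email': 0.9}
```

A downstream quality check could, for example, fail a pipeline when any column's validity ratio drops below a threshold.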
Language:
API:
Browser Support Policy:
For supported browsers, the latest stable version and the two previous stable versions are supported at the time of release.
NOTE: Stable browser versions released after a given release of
For more information, see Desktop Requirements.
Install:
NOTE: In the next release of
Import/Export:
CLI and v3 endpoints (Release 6.4):
NOTE: Do not attempt to connect to the
In Release 6.4:
| Ticket | Description |
|---|---|
| TD-40348 | When loading a recipe in an imported flow that references an imported Excel dataset, the Transformer page displays an `Input validation failed: (Cannot read property 'filter' of undefined)` error, and the screen is blank. |
| TD-42080 | Cannot run a flow or deployment that contains more than 10 recipe jobs. |
| Ticket | Description |
|---|---|
| TD-45923 | Publishing a compressed Snappy file to SFTP fails. |
| TD-45922 | You cannot publish TDE format to SFTP destinations. |
| TD-45492 | Publishing to Databricks Tables fails on ADLS Gen1 in user mode. |
| TD-45273 | Artifact Storage Service fails to start on HDP 3.1. |
| TD-45122 | API: re-running a job using only the |
| TD-44429 | Cannot publish outputs to relational targets, receiving |
| TD-44427 | Cannot publish a dataset containing duplicate rows to Teradata. Error message: |