...

  • Support for Java 8 will be deprecated. Customers must migrate to Java 11. Additional instructions will be provided.

    Tip: For Release 9.2, Java 11 is supported at runtime only.


  • Java 11 requires Spark 3.x. When Java 8 is deprecated, support for Spark 2.x will also be deprecated. Customers must migrate to Spark 3.x. Additional instructions will be provided.
  • These changes have the following implications:
    • Cloudera clusters do not support Spark 3.x. Customers using these running environments must migrate to Cloudera Data Platform.
    • Some deployments of EMR can migrate to Spark 3.x in this release. For more information, see Configure for EMR.
    • Some deployments of Databricks can migrate to Spark 3.x in this release. For more information, see the configuration documentation for your Databricks environment.
  • For more information on these changes, please contact Support.

Key Bug Fixes

Ticket      Description
TD-70522    Cannot import converted files such as Excel, PDF, or JSON through SFTP connections.
TD-69279    Test Connection button fails with a ValidationFailed error when editing a working connection configured with SSH tunneling.
TD-69004    Patch httpd to version 2.4.54.
TD-66185    Flatten transformation cannot handle multi-character delimiters.

New Known Issues

None.

Release 9.1

...

Job execution:

The application can check for changes to your dataset's schemas before jobs are executed and can optionally halt job execution to prevent data corruption.

...

NOTE: Support for Databricks 5.5 LTS has been deprecated. For more information, see End of Life and Deprecated Features.


Key Bug Fixes

Ticket      Description
TD-60881    For ADLS datasets, parameter indicators in Flow View are shifted by one character.

New Known Issues

None.

Release 9.0

...

Deprecated

None.

Key Bug Fixes

Ticket      Description
TD-68162    Flow parameters cannot be displayed or edited in the Transformer page and cannot be embedded in recipe steps.

New Known Issues

None.

Release 8.11

...

Key Bug Fixes

None.

New Known Issues

Ticket      Description
TD-68162    Flow parameters cannot be displayed or edited in the Transformer page and cannot be embedded in recipe steps.

Workaround: To edit your flow parameters, select Parameters from the Flow View context menu.


NOTE: There is no current workaround for embedding in recipe steps. While your existing parameters should continue to work at execution time, avoid changing the names of your flow parameters or editing the recipe steps in which they are referenced. New flow parameters cannot be used in recipes at this time.



Release 8.10

November 22, 2021

...

Maximum permitted record length has been increased from 1 MB to 20 MB. For more information, see Working with JSON v2.

NOTE: This new default limit can be modified if you are experiencing performance issues. See Configure Application Limits.


Publishing:

Improvements to publishing of Date values to Snowflake. For more information, see Improvements to the Type System.

...

  • You can view the list of collaborators and their corresponding avatars for shareable objects on the Flows, Plans, and Connections pages.

Sampling:


  • Adjust the size of samples loaded in the browser for your current recipe to improve performance and address low-memory conditions. See Change Recipe Sample Size.

...

  • Example values in source CSV file:

    """My product""",In stock,"16,000",0.05

    Note that the value 16,000 must be double-quoted, since the value contains a comma, which is the field delimiter.

  • Previously, these values appeared in the Transformer page columns as follows:

    c1                  c2          c3          c4
    """My product"""    In stock    "16,000"    0.05


  • As of this version, the application handles these values better when displaying them in the Transformer page (see the parser sketch after this list):

    c1              c2          c3        c4
    "My product"    In stock    16,000    0.05

    • c1: Escaped values (triple double-quotes) in the source no longer render in the application as triple double-quotes; they are represented as quoted values.

    • c3: Note that the double quotes in c3 have been stripped. Leading and trailing quotes are trimmed if the quotes are balanced within a cell.

      NOTE: This change in behavior applies only to newly created imported datasets sourced from a CSV file. Existing imported datasets should not be affected. However, if a newly imported dataset is transformed by a previously existing recipe that compensated for the extra quotes in the Transformer page, the effects on output data could be unpredictable. These recipes and their steps should be reviewed.

      This change does apply to any newly imported dataset sourced from CSV and may cause the data to change. For example, if you export an older flow and import it into a new workspace or project, this change in parsing behavior applies to the datasets that are newly created in the new environment. Recipes may require review upon import.


  • When results are generated in CSV, output files should continue to reflect the formatting of the source data before import. See above.

    Tip: You can also choose the Include quotes option when creating a CSV output.

    • When profiling is enabled, values that appear in CSV as "" are now marked as missing.
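
To illustrate the quoting behavior described above, here is a minimal sketch that runs the example source row through Python's standard csv module. This is for illustration only and is not the application's parser; standard CSV parsing of doubled quotes produces the same values that the Transformer page now displays.

    import csv
    import io

    # The example source row from above: doubled quotes escape a literal quote,
    # and "16,000" is quoted because it contains the field delimiter (a comma).
    source_row = '"""My product""",In stock,"16,000",0.05'

    # Parsing yields the values now shown in the Transformer page.
    [row] = csv.reader(io.StringIO(source_row))
    print(row)  # ['"My product"', 'In stock', '16,000', '0.05']

    # Writing the values back out restores the original source formatting,
    # which mirrors how generated CSV results continue to reflect the source.
    out = io.StringIO()
    csv.writer(out).writerow(row)
    print(out.getvalue().strip())  # """My product""",In stock,"16,000",0.05

Passing quoting=csv.QUOTE_ALL to the writer quotes every field, which is roughly analogous to choosing the Include quotes option for a CSV output.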

...

Key Bug Fixes

None.

New Known Issues

Ticket      Description
TD-63974    In imported datasets sourced from CSV files, double quotes that are escaped with a backslash (\"backslash-escaped value\") can cause the remainder of the row to be compressed into a single cell.
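
For background on this issue: standard CSV escapes a literal double quote by doubling it (""), so a parser that expects doubled quotes can misread backslash-escaped quotes and run the remaining fields together into one cell. The sketch below, again using Python's csv module purely for illustration (not the application's parser) and a hypothetical sample row, shows how such a row parses cleanly once the reader is told about the backslash escape.

    import csv
    import io

    # Hypothetical row in which quotes are escaped with backslashes instead of
    # being doubled, as described in TD-63974.
    source_row = r'"My \"quoted\" product",In stock,"16,000",0.05'

    # Declaring the escape character lets the reader recover all four fields.
    [row] = csv.reader(io.StringIO(source_row), escapechar='\\', doublequote=False)
    print(row)  # ['My "quoted" product', 'In stock', '16,000', '0.05']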