
Release 6.8.2

April 27, 2020

What's New

Changes in System Behavior

None.

Key Bug Fixes

Ticket | Description
TD-48245

By default, under SSO, manual logout and session-expiration logout redirect to different pages: manual logout directs you to the SAML sign-out page, while session expiry displays a session-expired page.

To redirect the user to a different URL on session expiry, an administrator can set the following parameter: webapp.session.redirectUriOnExpiry. This parameter applies to the following SSO environments:

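As a minimal sketch, this parameter can be set through the Admin Settings page or trifacta-conf.json (see Platform Configuration Methods); the redirect URL shown here is a hypothetical placeholder:

    "webapp.session.redirectUriOnExpiry": "https://sso.example.com/session-expired"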

New Known Issues

Ticket | Description
TD-48630

Connection files used by the data service are not persisted in a Dockerized environment.

Workaround: In the Admin Settings page, set data-service.vendor to a location that is persisted. Example value:

(Path-to-persistent-directory)/conf/data-service/application.properties
TD-47696

The platform may appear to fail to restart properly through the Admin Settings page because individual services can take longer to restart. Symptoms:

  • Changes to settings may appear not to have been applied.
  • The Admin Settings page appears to be stuck restarting.

Workaround: A restart can take several minutes. If the restart does not appear to complete, try reloading the page. If that does not work, restarting from the command line is more reliable. See Start and Stop the Platform.
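For reference, a minimal sketch of a command-line restart, assuming a default installation in which the platform is managed as the trifacta service (confirm the exact commands for your environment in Start and Stop the Platform):

    sudo service trifacta restart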


Release 6.8.1

February 7, 2020

This release enables a few new features and makes several relational connections generally available.

What's New

Install:

Import:

  • Upload tabular data from PDF documents.

    NOTE: This feature is in Beta release.

    NOTE: This feature must be enabled.

    See Import PDF Data.

  • Read support for ORC tables managed through Hive. See Configure for Hive.

LDAP:

Cluster Clean:

Documentation:

Changes in System Behavior

Wrangler Enterprise desktop application:

NOTE: In a future release, the Wrangler Enterprise desktop application will be deprecated. Please switch to a supported version of Google Chrome or Mozilla Firefox. Support for Edge Chromium is expected in a future release. See Desktop Requirements.

General availability:

  • The following relational connections are now generally available:
    • DB2 (import only)
    • Salesforce (import only)
    • Tableau Server (publish only)
      For more information, see Connection Types.

Key Bug Fixes

Ticket | Description
TD-45492

Publishing to Databricks Tables fails on ADLS Gen1 in user mode.

New Known Issues

Ticket | Description
TD-47263

Importing an exported flow that references a Google Sheets or Excel source breaks the connection to the input source.

Workaround: If the importing user has access to the source, that user can re-import the dataset and then swap in the re-imported dataset as the source for the broken recipe.


Release 6.8

December 6, 2019

Welcome to Release 6.8 of  Designer Cloud Enterprise Edition. This release introduces several key features around operationalizing the platform across the enterprise. Enterprise stakeholders can now receive email notifications when recurring jobs have succeeded or failed, updating data consumers outside of the platform. This release also introduces a generalized webhook interface, which facilitates push notifications to applications such as Slack when jobs have completed. When jobs fail, users can download a much richer support bundle containing configuration files, script files, and a specified set of log files. 

Macros can now be exported and imported across environments. In support of this feature, the Wrangle Exchange is now available through the Alteryx Community, where you can download macros created by others and import them for your own use. As with macros, you can now export and import flows across product editions and releases (Release 6.8 or later only).

In the application, you can now use shortcut keys to navigate around the workspace and the Transformer page. And support for the Firefox browser has arrived. Read on for more goodness added with this release.

What's New

Install:

Workspace:

  • Individual users can now enable or disable keyboard shortcuts in the workspace or Transformer page. See User Profile Page.
  • Configure locale settings at the workspace or user level. See Locale Settings.
  • You can optionally duplicate the datasets from a source flow when you create a copy of it. See Flow View Page.
  • Create a copy of your imported dataset. See Library Page.

Browser:

  • Support for Firefox browser. 
    NOTE: This feature is in Beta release.

    For supported versions, see Desktop Requirements.

Project Management:

Operationalization:

Supportability:

  • The downloadable logs bundle on job success or failure now contains extensive configuration information to assist in debugging. For more information, see Configure Support Bundling.

Connectivity:

  • Support for integration with EMR 5.8 - 5.27. For more information, see Configure for EMR. 

  • Create connections to Databricks Tables.

    NOTE: This connection is supported only when the Designer Cloud Powered by Trifacta platform is connected to an Azure Databricks cluster.

    For more information, see Create Databricks Tables Connections.


  • Support for using non-default database for your Snowflake stage.

Import:

  • As of Release 6.8, you can import an exported flow into any edition or release after the build number of the export. See Import Flow.

  • Improved monitoring of long-loading relational sources. See Import Data Page.

    NOTE: This feature must be enabled. See Configure JDBC Ingestion.

Transformer Page:

  • Select columns, functions applied to your source, and constants to replace your current dataset. See Select.
  • Improved Date/Time format selection. See Choose Datetime Format Dialog.

    Tip: Datetime formats in card suggestions now factor in the user's locale settings for greater relevance.

  • Improved matching logic and performance when matching columns through RapidTarget. 
    • Align columns based on the data contained in them, in addition to the column name.
    • This feature is enabled by default. For more information, see Overview of RapidTarget.
  • Improvements to the Search panel enable faster discovery of transformations, functions, and other objects. See Search Panel.

Job execution:

  • By default, the Designer Cloud application permits up to four jobs from the same flow to be executed at the same time. If needed, you can configure the application to execute jobs from the same flow one at a time. See Configure Application Limits.
  • If you enabled visual profiling for your job, you can download a JSON version of the visual profile. See Job Details Page.


Language:

API:

  • Apply overrides at time of job execution via API (see the sketch after this list).
  • Define import mapping rules for your deployments that use relational sources or publish to relational targets. 
  • Export and import macro definitions.
  • See Changes to the APIs.
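As a sketch of the first item above, the request below applies runtime overrides when launching a job through the v4 jobGroups endpoint. The host, port, access token, dataset id, and override values are hypothetical placeholders, and the field names should be confirmed against the API documentation referenced in Changes to the APIs:

    curl -X POST https://example.com:3005/v4/jobGroups \
      -H "Authorization: Bearer <api-access-token>" \
      -H "Content-Type: application/json" \
      -d '{
            "wrangledDataset": {"id": 28629},
            "overrides": {
              "execution": "spark",
              "profiler": true
            }
          }'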

Changes in System Behavior

Browser Support Policy:

  • For supported browsers, at the time of release, the latest stable version and the two previous stable versions are supported.

    NOTE: Stable browser versions released after a given release of Designer Cloud Enterprise Edition will NOT be supported for any prior version of Designer Cloud Enterprise Edition.  A best effort will be made to support newer versions released during the support lifecycle of the release.

    For more information, see Desktop Requirements.

Install:

NOTE: In the next release of Designer Cloud Enterprise Edition after Release 6.8, support for installation on CentOS/RHEL 6.x and Ubuntu 14.04 will be deprecated. You should upgrade the Alteryx node to a supported version of CentOS/RHEL 7.x or Ubuntu 16.04. Before performing the upgrade, please perform a full backup of the Designer Cloud Powered by Trifacta platform and its databases. See Backup and Recovery.


  • Support for Spark 2.1 has been deprecated. Please upgrade to a supported version of Spark.
    • Support for EMR 5.6 and EMR 5.7 has also been deprecated. Please upgrade to a supported version of EMR.
    • For more information, see Product Support Matrix.
  • To simplify the installation distribution, only the Hadoop dependencies for the recommended version are included in the software download. Dependencies for other supported Hadoop distributions must be downloaded from the Alteryx FTP site and installed on the Alteryx node. See Install Hadoop Dependencies.
  • The Alteryx node has been upgraded to use Python 3. This instance of Python has no dependencies on any Python version external to the Alteryx node.

Import/Export:

CLI and v3 endpoints (Release 6.4):

NOTE: Do not attempt to connect to the Designer Cloud Powered by Trifacta platform using any version of the CLI or the v3 endpoints. They are no longer supported and unlikely to work.

In Release 6.4:

  • The Command Line Interface (CLI) was deprecated. Customers must use the v4 endpoints for the APIs instead.
  • The v3 versions of the API endpoints were deprecated. Customers must use the v4 endpoints for the APIs instead.
  • Developer content was provided to assist in migrating to the v4 endpoints. 
  • For more information on acquiring this content, please contact Alteryx Support.

Key Bug Fixes

Ticket | Description
TD-40348

When loading a recipe in an imported flow that references an imported Excel dataset, the Transformer page displays an Input validation failed: (Cannot read property 'filter' of undefined) error, and the screen is blank.

TD-42080

Cannot run flow or deployment that contains more than 10 recipe jobs.


New Known Issues

Ticket | Description
TD-46123

Cannot modify the type of relational target for a publishing action.

Workaround: Create a new publishing action with the desired relational target. Remove the original one if necessary. See Run Job Page.

TD-45923

Publishing a compressed Snappy file to SFTP fails.

TD-45922

You cannot publish TDE format to SFTP destinations.
TD-45492

Publishing to Databricks Tables fails on ADLS Gen1 in user mode.

TD-45273

Artifact Storage Service fails to start on HDP 3.1.

Workaround: The Artifact Storage Service can be configured to reference the HDP 2.6 Hadoop bundle JAR instead.

Steps:

  1. You can apply this change through the Admin Settings Page (recommended) or trifacta-conf.json. For more information, see Platform Configuration Methods.
  2. Locate the following property:

    "artifact-storage-service.classpath"
  3. Replace this value:

    :%(topOfTree)s/%(hadoopBundleJar)s
  4. With the following:

    :%(topOfTree)s/conf/hadoop-site/:%(topOfTree)s/hadoop-deps/hdp-2.6/build/libs/hdp-2.6-bundle.jar
  5. Save changes and restart the platform.
TD-45122

API: Re-running a job by specifying only the wrangleDataset identifier fails if writeSettings were specified for the original job, even though the original job succeeded.

Workaround: Use a full jobGroups job specification each time that you run a job.

See https://api.trifacta.com/ee/7.1/index.html#operation/runJobGroup
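A minimal sketch of a full job specification for the jobGroups endpoint, assuming the request schema in the reference linked above; the dataset id, output path, and format values are hypothetical placeholders, and the field names should be confirmed against that reference:

    {
      "wrangledDataset": {"id": 28629},
      "overrides": {
        "execution": "photon",
        "profiler": true,
        "writesettings": [
          {
            "path": "hdfs://hadoop:50070/trifacta/queryResults/admin@example.com/output.csv",
            "action": "create",
            "format": "csv",
            "compression": "none"
          }
        ]
      }
    }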

TD-44429

Cannot publish outputs to relational targets; the error Encountered error while processing stream is returned.

Workaround: This issue may be caused by the trifacta service account not having write and execute permissions to the /tmp directory on the Alteryx node.

If so, you can do either of the following:

  1. Enable write and execute permissions for the account on /tmp.
  2. Create a new temporary directory and provide the service account write and execute permissions to it. Then, add the following to data-service.jvmOptions:

    -Dorg.xerial.snappy.tempdir=/new/directory/with/writeexecuteaccess
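As a sketch of the second option, using the placeholder directory from the example above and assuming the service account is named trifacta:

    sudo mkdir -p /new/directory/with/writeexecuteaccess
    sudo chown trifacta:trifacta /new/directory/with/writeexecuteaccess
    sudo chmod 770 /new/directory/with/writeexecuteaccess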
TD-44427

Cannot publish dataset containing duplicate rows to Teradata. Error message:

Caused by: java.sql.SQLException: [Teradata Database] [TeraJDBC 15.10.00.14] [Error -2802] [SQLState 23000] Duplicate row error in abc_trifacta.tmp_218768523.
 at 

Workaround: This is a known limitation on Teradata. For more information on this limitation, see Enable Teradata Connections.
