Release 4.2.2
This release includes bug fixes that were previously published as part of Hot Fixes for Release 4.2.1.
What's New
No new features.
Changes to System Behavior
SSL connections to Tableau Server now work correctly. See Create Tableau Server Connections.
Key Bug Fixes
Ticket | Description |
---|---|
TD-31118 | Update Python to 2.7.14 |
TD-30307 | Update Tomcat libraries to 8.5.28 |
TD-30239 | Tableau publish doesn't allow non-default site and project names with uppercase or spaces. |
TD-30210 | TDE generation does not include data when the job is run on Spark. |
TD-29954 | Tableau SSL support. See Create Tableau Server Connections. |
TD-29777 | Dataset owner cannot edit custom SQL. |
TD-29562 | Cannot change user's upload folder |
TD-29552 | Data Service is running out of memory |
TD-29525 | Validation error for Decimal data type when publishing to Hive table |
TD-29514 | When creating a dataset from a Redshift connection, multiple copies of a column are present. |
TD-29387 | Spark job runner in batch job runner should not eat exceptions. |
TD-29284, TD-29478 | Cannot remove structure for shared datasets |
TD-29223 | Machine learning service fails to gracefully handle parallel requests. |
TD-29055, TD-29563 | Status Code 403 Forbidden when loading samples |
TD-28510 | Snapshots containing reused nodes contain multiple copies of each node. |
TD-28090 | Transformation engine crashes (error -1) when sample validation fails. |
New Known Issues
None.
Release 4.2.1
This release includes numerous bug fixes, support for new distributions, and new capabilities, such as the option to disable initial type inference on schematized sources.
What's New
Import:
- Enable or disable initial type inference for schematized sources at global or individual connection level, or for individual dataset sources. See Configure Type Inference.
Publishing:
- Support for publishing Datetime data to Hive Datetime or Timestamp data types. See Hive Data Type Conversions.
Install, Config & Admin:
- Support for Ubuntu 16.04. See System Requirements.
- Support for Cloudera 5.13. See Supported Deployment Scenarios for Cloudera.
NOTE: Support for CDH 5.10 has been deprecated. Please upgrade your Hadoop cluster. For more information, see End of Life and Deprecated Features.
Changes to System Behavior
None.
Key Bug Fixes
Ticket | Description |
---|---|
TD-27799 | DATEDIF function does not work for inputs that are functions returning date values. |
TD-27703 | Spark job fails with scala.MatchError |
TD-24121 | When publishing multi-part files, different permissions are written to the parent directory depending on whether the job was executed on Hadoop or the Trifacta Photon running environment. |
New Known Issues
Ticket | Component | Description |
---|---|---|
TD-27950 | Transformer Page - Tools | When you join with an imported dataset that is not in your flow and collecting its initial sample takes longer than expected, you may encounter an error. Workaround: Create a recipe from the imported dataset and then join to the recipe, which is the preferred method of joining. For more information, see Join Window. |
TD-27784 | Installer/Upgrader/Utilities | Ubuntu 16 install for Azure: supervisord complains about "missing" Python packages. Workaround: These packages are present but lack the appropriate permissions. A workaround is documented as part of the installation and configuration process. For more information, see "Workaround for missing Python packages" in Configure for Azure. |
Release 4.2
This release introduces deployment management, which enables separation of development and production flows and their related jobs. Develop your flows in a Dev environment and, when ready, push to Prod, where they can be versioned and triggered for production execution. Additionally, you can create and manage all of your connections through the new Connections page. A revamped flow view streamlines object interactions and now supports starting and stopping of jobs without leaving flow view.
Release 4.2 also supports installation of the platform on Amazon EC2 instances and integration with EMR, as well as installation for Microsoft Azure.
Details are below.
What's New
Deployment Management:
- Manage the lifecycle process of flows across multiple platform instances, building in Dev and publishing to Prod. See Overview of Deployment Manager.
- Manage versions deployed into Production. See Deployment Manager Page.
Workspace:
- New objects in Flow View and better organization of them. See Flow View Page.
NOTE: Wrangled datasets are no longer objects in the Designer Cloud Powered by Trifacta platform. Their functionality has been moved to other and new objects. For more information, see Changes to the Object Model.
See Object Overview.
- Create, manage, and share connections through the new Connections page. See Connections Page.
- Sharing of connections and flows is enabled by default. See Configure Sharing.
- Import and export flows from your platform instance. See Export Flow and Import Flow.
- Cancel jobs in progress. See Flow View Page and Jobs Page.
Transformer Page:
- Perform cross joins between datasets. See Join Window.
- Cut, copy, and paste columns and column values. See Column Browser Panel.
- Rename multiple columns in a single transformation step. See Rename Columns.
- In Column Details, you can select a phone number or date pattern to generate suggestions for standardizing the values in the column to a single format. See Column Details Panel.
Personalization:
- Personalized suggestions presented based on your previous usage.
- Browse and select patterns for re-use from your recent history.
- Upload your own avatar image. See User Profile Page.
NOTE: This feature may need to be enabled. See Miscellaneous Configuration.
Install/Admin/Config:
- Install from Amazon Marketplace via AMI into a deployed EC2 instance.
- Leverage IAM roles to manage permissions for the Designer Cloud Powered by Trifacta platform deployed on an EC2 instance. See Configure for EC2 Role-Based Authentication.
- Install and integrate with Amazon Elastic MapReduce (EMR). See Configure for EMR.
- Install for Microsoft Azure and integrate with HDInsight. See Install from Azure Marketplace.
Integration:
- Redshift improvements:
- The Designer Cloud Powered by Trifacta platform supports multiple private and global connections to Redshift databases. See Create Redshift Connections.
- You can read from Redshift databases. See Redshift Browser.
- Publish directly to Tableau Server. See Run Job Page.
- For more information on creating the connection, see Create Tableau Server Connections.
Language:
- New string comparison functions.
- New SUBSTITUTE function replaces string literals or patterns with a new literal or column value.
- See Changes to the Language.
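The new SUBSTITUTE function described above replaces literals or patterns with a new value. A rough Python analogue (hypothetical, not the platform's implementation; the function name and signature here are illustrative only):

```python
import re

# Hypothetical analogue of a SUBSTITUTE-style replacement: swap a string
# literal or a regex pattern in a value for a new literal. Illustration
# only; the platform's own function and syntax may differ.
def substitute(value: str, pattern: str, replacement: str,
               use_regex: bool = False) -> str:
    if use_regex:
        return re.sub(pattern, replacement, value)
    return value.replace(pattern, replacement)

print(substitute("order-00123", r"\d+", "N", use_regex=True))  # order-N
print(substitute("a-b-c", "-", "_"))                           # a_b_c
```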
Import:
- Expanded set of encoding types supported for file import. See Configure Global File Encoding Type.
Performance:
- Improved performance when initializing jobs and in Flow View for complex flows.
Changes to System Behavior
New session duration parameter and default value
For technical reasons, the name and default value of the following parameter have been changed in Release 4.2.
Affected Releases | Parameter Name | Default Value | Max Value |
---|---|---|---|
Release 4.2 and later | webapp.session.DurationInMins | 10080 (one week) | 30000 |
Release 4.1.1 and earlier | webapp.session.DurationInMinutes | 43200 (one month) | 30000 |
NOTE: Upgrading customers have the new configuration setting automatically set to the default: 10080 minutes (one week). You must make adjustments as needed.
For more information on changing this parameter value, see Configure Application Limits.
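As a quick sanity check on the table above, the defaults convert to calendar units as follows (plain arithmetic, no platform API involved):

```python
# Sanity-check the session duration defaults: 10080 minutes is exactly
# one week, and the previous default of 43200 minutes is a 30-day month.
MINUTES_PER_DAY = 24 * 60

new_default_days = 10080 / MINUTES_PER_DAY
old_default_days = 43200 / MINUTES_PER_DAY
print(new_default_days, old_default_days)  # 7.0 30.0
```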
/docs endpoint is removed
In Release 4.0, the /docs endpoint was deprecated. This endpoint displayed a documentation page containing information on the Wrangle language, the command line interface, and Alteryx patterns.
In Release 4.2, the endpoint has been removed from the platform. Its content has been superseded by the following:
- See Wrangle Language.
- See Text Matching.
For more information on features that have been deprecated or removed, see End of Life and Deprecated Features.
s3n is no longer supported
If you are integrating with S3 sources, the platform now requires use of the s3a protocol. The s3n protocol is no longer supported.
No configuration changes in the Designer Cloud Powered by Trifacta platform are needed. See Enable S3 Access.
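For any stored dataset locations or scripts that still reference the old scheme, only the URI prefix changes (the bucket and path below are placeholders):

```python
# Hypothetical example of migrating a stored S3 URI from the deprecated
# s3n scheme to s3a; only the scheme prefix changes. Bucket and key
# names are placeholders.
old_uri = "s3n://example-bucket/datasets/orders.csv"
new_uri = old_uri.replace("s3n://", "s3a://", 1)
print(new_uri)  # s3a://example-bucket/datasets/orders.csv
```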
Key Bug Fixes
Ticket | Description |
---|---|
TD-27748 | Direct publish to Hive fails on wide datasets due to Avro limitations. |
TD-27368 | SQL Server Database timing out with long load times. |
TD-27197 | Column histogram does not update after adding pluck parameter to unnest transform. |
TD-27127 | Send a Copy tab in Flow View sharing does not include all available users. |
TD-27055 | Job run on flow with complex recipes fails on Hadoop but succeeds on the Trifacta Photon running environment. |
TD-26837 | Creating custom dictionaries fails on S3 backend datastore. |
TD-26388 | Orphaned bzip2 processes owned by the platform user accumulate on the node. |
TD-26041 | When editing a schedule that was set for 0 minutes after the hour, the schedule is displayed to execute at 15 minutes after the hour. |
TD-25903 | Overflow error when ROUND function is applied to large values. |
TD-25733 | Attempting a union of 12 datasets crashes UI. |
TD-25709 | Spark jobs fail if HDFS path includes commas. |
New Known Issues
Ticket | Component | Description |
---|---|---|
TD-27799 | Compilation/Execution | DATEDIF function does not work for inputs that are functions returning date values. Workaround: Write the function that returns your date values to a new column. Then apply the DATEDIF function using that column as an input. |
TD-27703 | Compilation/Execution | Spark job fails with scala.MatchError |
TD-26069 | Compilation/Execution | The Trifacta Photon running environment evaluates |
TD-24121 | Compilation/Execution | When publishing multi-part files, different permissions are written to the parent directory depending on whether the job was executed on Hadoop or the Trifacta Photon running environment. |
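The TD-27799 workaround (materialize the date-returning function into its own column before applying DATEDIF) can be illustrated in plain Python; the dates and function name here are hypothetical, and the Wrangle syntax is omitted:

```python
from datetime import date

# Illustration of the TD-27799 workaround: instead of nesting a
# date-returning function directly inside a DATEDIF-style call,
# materialize the derived dates first, then compute the difference.
def derived_date(year: int, month: int, day: int) -> date:
    return date(year, month, day)

# Step 1: write each function's result to its own column (variables here).
start_col = derived_date(2017, 1, 1)
end_col = derived_date(2017, 1, 31)

# Step 2: apply the date difference to the materialized columns.
datedif_days = (end_col - start_col).days
print(datedif_days)  # 30
```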