TD-28930: Delete other columns causes column lineage to be lost and reorders columns.
TD-28573: Photon running environment executes column splits for fixed-length columns using byte length, instead of character length. In particular, this issue affects columns containing special characters.
TD-27784: Ubuntu 16 install for Azure: supervisord complains about "missing" Python packages.
TD-26069: Photon evaluates date(yr, month, 0) as the first date of the previous month. It should return a null value.

Security Fixes

The following security-related fixes were completed in this release.


In Apache Log4j 2.x before 2.8.2, when using the TCP socket server or UDP socket server to receive serialized log events from another application, a specially crafted binary payload can be sent that, when deserialized, can execute arbitrary code.

See CVE-2017-5645.


Multiple integer overflows in libgfortran might allow remote attackers to execute arbitrary code or cause a denial of service (Fortran application crash) via vectors related to array allocation.

See CVE-2014-5044.


Hawk before 3.1.3 and 4.x before 4.1.1 allow remote attackers to cause a denial of service (CPU consumption or partial outage) via a long (1) header or (2) URI that is matched against an improper regular expression. The version of less was upgraded to address this security vulnerability.

See CVE-2016-2515.


Spring Security (Spring Security 4.1.x before 4.1.5, 4.2.x before 4.2.4, and 5.0.x before 5.0.1; and Spring Framework 4.3.x before 4.3.14 and 5.0.x before 5.0.3) does not consider URL path parameters when processing security constraints. By adding a URL path parameter with special encodings, an attacker may be able to bypass a security constraint. The root cause of this issue is a lack of clarity regarding the handling of path parameters in the Servlet Specification. Some Servlet containers include path parameters in the value returned for getPathInfo() and some do not. Spring Security uses the value returned by getPathInfo() as part of the process of mapping requests to security constraints. In this particular attack, different character encodings used in path parameters allows secured Spring MVC static resource URLs to be bypassed.

See CVE-2018-1199.
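The mechanism above hinges on URL path parameters (the `;name=value` suffix a Servlet container may or may not include in `getPathInfo()`). A minimal illustrative sketch of the usual defense — normalizing a request path by stripping path parameters from every segment before matching it against security constraints — is shown below. This is a hypothetical helper class, not Spring Security's actual fix:

```java
// Hypothetical normalizer: removes URL path parameters (";name=value")
// from each path segment so that "/admin;jsessionid=abc/users" and
// "/admin/users" match the same security constraint.
public final class PathParamStripper {
    private PathParamStripper() {}

    public static String strip(String path) {
        StringBuilder out = new StringBuilder();
        for (String segment : path.split("/", -1)) {
            int semi = segment.indexOf(';');
            // Keep only the part of the segment before any ';'.
            out.append(semi >= 0 ? segment.substring(0, semi) : segment);
            out.append('/');
        }
        // Drop the trailing slash added by the loop, preserving root "/".
        if (out.length() > 1) out.setLength(out.length() - 1);
        return out.toString();
    }
}
```

For example, `strip("/admin;jsessionid=abc/users")` yields `/admin/users`, which would then be compared against the constraint mappings.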


Apache POI in versions prior to release 3.15 allows remote attackers to cause a denial of service (CPU consumption) via a specially crafted OOXML file, aka an XML Entity Expansion (XEE) attack.

See CVE-2017-5644.
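XML Entity Expansion attacks like the one above rely on the parser honoring DOCTYPE declarations with recursive entity definitions. A common hardening sketch in plain Java (this is standard JAXP configuration, not POI's internal fix) disables DOCTYPE processing and enables secure processing limits:

```java
import javax.xml.XMLConstants;
import javax.xml.parsers.DocumentBuilderFactory;

public final class SafeXml {
    private SafeXml() {}

    public static DocumentBuilderFactory newHardenedFactory() throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        // Enforce JAXP secure processing, which caps entity expansion counts.
        dbf.setFeature(XMLConstants.FEATURE_SECURE_PROCESSING, true);
        // Reject DOCTYPE declarations entirely, blocking entity-expansion bombs.
        dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
        return dbf;
    }
}
```

With this factory, parsing an ordinary document succeeds, while any document carrying a DOCTYPE (the prerequisite for an XEE payload) is rejected with a `SAXException`.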


math.js before 3.17.0 had an arbitrary code execution vulnerability in the JavaScript engine. Creating a typed function with JavaScript code in the name could result in arbitrary code execution.

See CVE-2017-1001002.


If a user of Commons-Email (typically an application programmer) passes unvalidated input as the so-called "Bounce Address", and that input contains line breaks, then the email details (recipients, contents, etc.) might be manipulated. Mitigation: Users should upgrade to Commons-Email 1.5. You can mitigate this vulnerability for older versions of Commons-Email by stripping line breaks from data that will be passed to Email.setBounceAddress(String).

See CVE-2018-1294.
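The mitigation for older Commons-Email versions described above can be sketched as a small sanitizing helper. The class name is hypothetical; the point is simply to remove CR/LF characters from untrusted input before it reaches `Email.setBounceAddress(String)`:

```java
// Hypothetical helper for the pre-1.5 Commons-Email mitigation: strip
// line breaks from untrusted input so it cannot inject extra SMTP
// headers through the bounce address.
public final class BounceAddressSanitizer {
    private BounceAddressSanitizer() {}

    public static String sanitize(String bounceAddress) {
        if (bounceAddress == null) {
            return null;
        }
        // Remove CR and LF, the characters that delimit SMTP headers.
        return bounceAddress.replace("\r", "").replace("\n", "");
    }
}
```

Usage would then be along the lines of `email.setBounceAddress(BounceAddressSanitizer.sanitize(userInput))`.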


Apache Commons FileUpload before 1.3.3 is vulnerable to remote code execution via DiskFileItem file manipulation.

See CVE-2016-1000031.

New Known Issues


When creating Tableau Server connections, the Test Connection button is missing.


Workaround: Create the connection. Create a simple dataset with a minimal recipe. Run a job on it. From the Export Results window, try to publish to Tableau Server. If you cannot connect to the Tableau Server, try specifying a value for the Site Name in the Export Results window.


Copying a flow invalidates the samples in the new copy. Copying or moving a node within a flow invalidates the node's samples.


NOTE: This issue also applies to flows that were upgraded from a previous release.


Workaround: Recreate the samples after the move or copy.

TD-31252: Transformer Page - Tools

Assigning a target schema through the Column Browser does not refresh the page.


Workaround: To update the page, reload the page through the browser.


Job results are incorrect when a sample is collected and then the last transform step is undone.


Workaround: Recollect a sample after undoing the transform step.


Matching file path patterns in a large directory can be very slow, especially if using multiple patterns in a single dataset with parameters.


Workaround: To increase matching speed, avoid wildcards in top-level directories and be as specific as possible with your wildcards and patterns.


When creating a new dataset from the Export Results window for a CSV dataset with Snappy compression, the resulting dataset is empty when loaded in the Transformer page.


Workaround: Re-run the job with Snappy compression disabled. Then, export the new dataset.

TD-30820: Compilation/Execution: Some string comparison functions process leading spaces differently when executed on the Photon or the Spark running environment.
TD-30717: Connectivity: No validation is performed for Redshift or SQL DW connections or permissions prior to job execution. Jobs are queued and then fail.

Spark jobs run on an ADLS cluster fail when Snappy compression is applied to the output.


Workaround: Job execution should work if Snappy compression is installed on the cluster.

TD-30342: Connectivity: No data validation is performed during publication to Redshift or SQL DW.

Redshift: No support via CLI or API for:

  • creating Redshift connections,
  • running jobs on data imported from Redshift,
  • publishing job results to Redshift

Workaround: Please execute these tasks through the application.

TD-30074: Type System

Pre-import previews of Bigint values from Hive or Redshift are incorrect.


Workaround: The preview values are incorrect, but after the dataset is imported, the values are accurate.


In a reference dataset, a UDF from the source dataset is not executed if the new recipe contains a join or union step.


Workaround: Publish the source dataset. In the Export Results window, create a new dataset from the results. Import it as your reference data.


When the platform is restarted or an HA failover state is reached, any running jobs are permanently stuck in the In Progress state.