
...

Ticket - Component - Description

TD-40348 - Transformer Page

When loading a recipe in an imported flow that references an imported Excel dataset, the Transformer page displays an Input validation failed: (Cannot read property 'filter' of undefined) error, and the screen is blank.

Tip

Workaround: In Flow View, select an output object, and run a job. Then, load the recipe in the Transformer page and generate a new sample. For more information, see Import Flow.


TD-35714 - Installer/Upgrader/Utilities

After installing on Ubuntu 16.04 (Xenial), the platform may fail to start with an ImportError: No module named pkg_resources error.

Tip

Workaround: Verify that the python-setuptools package is installed. Install it if missing.
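A minimal sketch of the check, since the import itself is the test (the apt package name comes from the workaround above):

```python
# Minimal check that pkg_resources (provided by python-setuptools) is importable.
try:
    import pkg_resources  # noqa: F401
    print("python-setuptools is available")
except ImportError:
    # On Ubuntu 16.04: sudo apt-get install python-setuptools
    print("python-setuptools is missing")
```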


TD-35644 - Compilation/Execution

Extract patterns with the "HTTP Query strings" option does not work.
TD-35562 - Compilation/Execution

When executing Spark 2.3.0 jobs on S3-based datasets, jobs may fail due to a known incompatibility between HTTPClient 4.5.x and aws-java-sdk 1.10.x. For details, see https://github.com/apache/incubator-druid/issues/4456.

Tip

Workaround: Use Spark 2.1.0 instead. On the Admin Settings page, set the spark.version property to 2.1.0. For more information, see Admin Settings Page.

For additional details on Spark versions, see Configure for Spark.
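If you prefer to script the change rather than use the Admin Settings page, the sketch below edits the platform configuration file directly. The install path and the nesting of the spark.version key inside trifacta-conf.json are assumptions; back up the file first and restart the platform afterward.

```python
# Hypothetical sketch: pin spark.version to 2.1.0 by editing trifacta-conf.json
# (path and key nesting are assumptions; back up the file before editing).
import json

conf_path = "/opt/trifacta/conf/trifacta-conf.json"  # assumed default location

with open(conf_path) as f:
    conf = json.load(f)

conf.setdefault("spark", {})["version"] = "2.1.0"

with open(conf_path, "w") as f:
    json.dump(conf, f, indent=2)
```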

TD-35504 - Compilation/Execution

Clicking the Cancel Job button generates a 405 status code error, and clicking the Yes button fails to close the dialog.

Tip

Workaround: After you have clicked the Yes button once, you can click the No button. The job is removed from the page.


TD-35486 - Compilation/Execution

Spark jobs fail when the LCM function is applied to negative numbers as inputs.

Tip

Workaround: If you wrap each negative input in the ABS function, the LCM function can be computed. You may have to check manually whether a negative sign applies to the LCM output.
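The sketch below illustrates the workaround in Python terms; the corresponding Wrangle formula shape would be LCM(ABS(colA), ABS(colB)), shown here for illustration only.

```python
# Illustration of the workaround: wrap negative inputs in ABS before LCM,
# then decide manually whether a negative sign applies to the result.
from math import gcd

def lcm(a: int, b: int) -> int:
    return abs(a * b) // gcd(a, b) if a and b else 0

# LCM(-4, 6) fails in the Spark job; LCM(ABS(-4), ABS(6)) computes 12.
print(lcm(abs(-4), abs(6)))  # 12
```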


TD-35483 - Compilation/Execution

The WEEKNUM function is calculated differently in the Photon and Spark running environments, due to the underlying frameworks on which the environments are built:

  • Photon week 1 of the year: The week that contains January 1.
  • Spark week 1 of the year: The week that contains at least four days in the specified year.

For more information, see WEEKNUM Function.
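The difference is easiest to see on a date near the year boundary. The sketch below computes both conventions for 2016-01-01, a Friday; the Photon-style calculation assumes weeks start on Monday, which is an assumption for illustration.

```python
# Compare the two week-numbering conventions for 2016-01-01 (a Friday).
from datetime import date

d = date(2016, 1, 1)

# Spark-style (ISO 8601): week 1 is the first week with at least four days
# in the year, so Jan 1, 2016 falls in week 53 of 2015.
spark_week = d.isocalendar()[1]

# Photon-style: week 1 is the week that contains January 1 (weeks assumed
# to start on Monday for this illustration).
jan1 = date(d.year, 1, 1)
photon_week = (d.timetuple().tm_yday + jan1.weekday() - 1) // 7 + 1

print(spark_week, photon_week)  # 53 1
```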

TD-35478 - Compilation/Execution
The Spark running environment does not support the use of multi-character delimiters for CSV outputs. For more information on this issue, see https://issues.apache.org/jira/browse/SPARK-24540.
Tip

Workaround: You can switch your job to a different running environment or use single-character delimiters.
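A minimal PySpark sketch of the single-character fallback (session setup, schema, and output path are illustrative):

```python
# Write CSV output with a single-character delimiter; multi-character
# delimiters such as "||" are rejected for CSV output on this Spark version
# (see SPARK-24540).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("csv-delimiter-workaround").getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

df.write.option("sep", "|").csv("/tmp/single_char_output")  # works
# df.write.option("sep", "||").csv("/tmp/multi_char_output")  # fails
```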


TD-34840 - Transformer Page

The platform fails to provide transformation suggestions when you select keys from an object that contains many keys.
TD-34119 - Compilation/Execution

WASB job fails when publishing two successive appends.
TD-30855 - Publish

Creating a dataset from Parquet-only output results in a "Dataset creation failed" error.

Info

NOTE: If you generate results in Parquet format only, you cannot create a dataset from them, even if the Create button is present.


TD-30828 - Publish

You cannot publish ad-hoc results for a job while another publishing operation is in progress for the same job.

Tip

Workaround: Wait until the previous publishing job has completed before retrying to publish the failing job.


TD-27933 - Connectivity

For multi-file imports, if a file lacks a newline after its final record, that record may be merged with the first record of the next file and then dropped in the Photon running environment.

Tip

Workaround: Verify that a newline has been inserted at the end of every file-based source.
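A small sketch of that check, which appends a trailing newline where one is missing (file names are illustrative):

```python
# Ensure every file-based source ends with a newline so its last record is
# not merged with the first record of the next file.
import os

for path in ["part-0001.csv", "part-0002.csv"]:  # illustrative file names
    if os.path.getsize(path) == 0:
        continue
    with open(path, "rb+") as f:
        f.seek(-1, os.SEEK_END)
        if f.read(1) != b"\n":
            f.write(b"\n")
```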


...