
Release 7.6.2


This section contains information on the file formats and compression schemes that are supported for input to and output from Designer Cloud Enterprise Edition.

NOTE: To work with formats that are proprietary to a desktop application, such as Microsoft Excel, you do not need the supporting application installed on your desktop.

Filenames

NOTE: Filenames that include special characters can cause problems during import or when publishing to a file-based datastore.

Forbidden characters in import filenames:


Tip: This list may not be complete for all available running environments.



Running Environment    Forbidden Characters
General                "/"
Spark                  "{", "*", "\"
Photon                 "\"

Native Input File Formats

Designer Cloud Enterprise Edition can directly read and import the following file formats:

  • Excel (XLS/XLSX)

    NOTE: Other Excel-related formats, such as XLSM format, are not supported.

    NOTE: The hashtag character (#) is not supported in filenames for this file format.


    Tip: You may import multiple worksheets from a single workbook at one time. See Import Excel Data in the User Guide.


  • PDF

    NOTE: PDF support may need to be enabled in your environment. See Import PDF Data.


  • CSV
  • JSON, including nested

    NOTE: Designer Cloud Enterprise Edition requires that JSON files be submitted with one valid JSON object per line. Consistently malformed JSON objects, or objects that span line breaks, might cause import to fail. See Initial Parsing Steps in the User Guide.

  • Plain Text
  • LOG
  • TSV
  • Parquet

    NOTE: When working with datasets sourced from Parquet files, lineage information and the $sourcerownumber reference are not supported.


  • Avro

  • XML

    Tip: XML files can be ingested as unstructured text.


For more information on how data is handled initially, see Initial Parsing Steps in the User Guide.
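The one-object-per-line requirement for JSON imports can be verified ahead of time with a short script such as this sketch (the function name is illustrative):

```python
import json

def invalid_json_lines(path):
    """Return the line numbers that do not contain a single valid JSON object."""
    bad = []
    with open(path, encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            if not line.strip():
                continue  # ignore blank lines
            try:
                json.loads(line)
            except json.JSONDecodeError:
                bad.append(lineno)
    return bad
```

Each line must parse on its own, so an object pretty-printed across several lines is flagged even though the file as a whole is valid JSON.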

Native Output File Formats

Designer Cloud Enterprise Edition can write to these file formats:

  • CSV
  • JSON


  • Tableau Hyper

    NOTE: Publication of results in Hyper format may require additional configuration. See below.

  • Tableau TDE

    NOTE: TDE has been superseded by the Hyper format. Please switch to using Hyper format. TDE will be deprecated in a future release.

  • Avro

    NOTE: The Trifacta Photon and Spark running environments apply Snappy compression to this format.

  • Parquet

    NOTE: The Trifacta Photon and Spark running environments apply Snappy compression to this format.


Compression Algorithms

NOTE: Import of a compressed file whose underlying format is binary, such as Excel or PDF, is not supported.


NOTE: Importing a compressed file with a high compression ratio can overload the available memory for the application. In such cases, you can decompress the file before uploading. If decompression fails, you should contact your administrator about increasing the Java Heap Size memory.

NOTE: Publication of results in Snappy format may require additional configuration. See below.

NOTE: GZIP files on Hadoop are not split across multiple nodes. As a result, jobs can crash when such a file is processed through a single Hadoop task. This is a known issue with GZIP on Hadoop.

Where possible, limit the size of your GZIP files to 100 MB or less, or use BZIP2 as an alternative compression method. As a workaround, you can try to run the job on the unzipped file. You may also disable profiling for the job. See Run Job Page in the User Guide.
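One way to keep GZIP inputs under the suggested limit is to split a large source file on line boundaries and compress each part separately. A minimal sketch, in which the part-naming scheme and default threshold are assumptions:

```python
import gzip

def split_and_gzip(path, max_bytes=100 * 1024 * 1024):
    """Split a text file into gzipped parts of roughly max_bytes uncompressed each.

    Splits only on line boundaries so every part remains a valid input file.
    Returns the list of part filenames written.
    """
    parts = []
    part_num, written = 0, 0
    out = None
    with open(path, "rb") as src:
        for line in src:
            # Start a new part before the first line or once the cap is reached.
            if out is None or written >= max_bytes:
                if out:
                    out.close()
                part_name = f"{path}.part{part_num:03d}.gz"
                out = gzip.open(part_name, "wb")
                parts.append(part_name)
                part_num, written = part_num + 1, 0
            out.write(line)
            written += len(line)
    if out:
        out.close()
    return parts
```

Each resulting part stays small enough to be handled by a single Hadoop task without exhausting memory.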



Read Native File Formats



Format    GZIP         BZIP         Snappy
CSV       Supported    Supported    Supported
JSON      Supported    Supported    Supported
Avro      -            -            Supported
Hive      -            -            Supported


Write Native File Formats



Format    GZIP         BZIP         Snappy
CSV       Supported    Supported    Supported
JSON      Supported    Supported    Supported
Avro      -            -            Supported; always on
Hive      -            -            Supported; always on



Additional Configuration for File Format Support

Publication of some formats requires execute permissions

When job results are generated and published in the following formats, the Designer Cloud Powered by Trifacta platform extracts a binary executable from an included JAR into a temporary directory and then runs the binary from that directory to generate the results in the proper format. By default, this directory is set to /tmp on the Alteryx node.

In many environments, execute permissions are disabled on /tmp for security reasons. Use the steps below to specify the temporary directory where this binary can be moved and executed.

Steps:

  1. Log in to the application as an administrator.
  2. From the menu, select User menu > Admin console > Admin settings.
  3. For each of the following file formats, locate the listed parameter and add the setting that specifies a directory from which the related binary can be executed:

    File Format    Parameter                        Setting to Add
    Snappy         "data-service.jvmOptions"        -Dorg.xerial.snappy.tempdir=<some executable directory>
    TDE            "batch-job-runner.jvmOptions"    -Djna.tmpdir=<some executable directory>
    Hyper          See previous.                    See previous.
  4. Save your changes and restart the platform.

  5. Run a job configured for direct publication of the modified file format.
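As an example, the Snappy entry in the Admin settings might look like the following fragment. The directory path is an assumption; substitute any directory on the Alteryx node where execute permissions are allowed:

```json
"data-service.jvmOptions": [
  "-Dorg.xerial.snappy.tempdir=/opt/trifacta/executable-tmp"
]
```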

