Supported File Formats
This section contains information on the file formats and compression schemes that are supported for input to and output from Alteryx Analytics Cloud (AAC).
Note
To work with formats that are proprietary to a desktop application, such as Microsoft Excel, you do not need the supporting application installed on your desktop.
Filenames
Note
During import, AAC identifies file formats based on the extension of the filename. If no extension is provided, AAC assumes that the submitted file is a text file of some kind. Non-text file formats, such as Avro and Parquet, require filename extensions.
Note
Filenames that include special characters can cause problems during import or when publishing to a file-based datastore.
File Path Length Limits
Maximum character limits for file paths:
File paths to sources for imported datasets: 1024 characters
Tip
This limit (storagelocations) applies to both files and tables.
File paths to output files: 2048 characters
Tip
This limit (writesettings) applies to files stored on any file-based storage location.
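The limits above can be checked before a job is submitted. The following sketch is illustrative only; the limit names mirror the settings mentioned in the tips, and the function is not an AAC API.

```python
# Hypothetical pre-flight check of file path lengths against the
# documented AAC limits (1024 for import sources, 2048 for outputs).
LIMITS = {"import": 1024, "output": 2048}

def check_path(path: str, kind: str) -> bool:
    """Return True if the path fits the limit for its kind ('import' or 'output')."""
    return len(path) <= LIMITS[kind]

# A short source path is fine; an oversized output path is rejected.
assert check_path("s3://bucket/data/orders.csv", "import")
assert not check_path("s3://bucket/" + "x" * 2048, "output")
```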
Forbidden Characters in Import Filenames
The following characters cause issues in the listed areas of the product. If you encounter problems during import, these listings may help you determine where the issue occurred.
Tip
You should avoid using any of these characters in your import filenames. This list may not be complete for all available running environments.
General: "/"
Web browser: "\"
Excel filenames: "#", "{", "}"
Spark-based running environment: "{", "*", "\"
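A filename can be screened against these character lists before import. This is a minimal sketch; the sets and the function name are illustrative and not part of AAC.

```python
# Hypothetical check of an import filename against the forbidden-character
# lists documented above, grouped by the affected area of the product.
FORBIDDEN = {
    "general": set("/"),
    "web_browser": set("\\"),
    "excel": set("#{}"),
    "spark": set("{*\\"),
}

def forbidden_chars(filename: str) -> dict:
    """Return, per area, the forbidden characters found in the filename."""
    return {
        area: sorted(chars & set(filename))
        for area, chars in FORBIDDEN.items()
        if chars & set(filename)
    }

# Braces trip up both Excel filenames and the Spark running environment.
print(forbidden_chars("sales{2024}.xlsx"))
```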
Native Input File Formats
AAC can directly read and import these file formats:
CSV
JSON
Note
AAC can read JSON natively, but JSON often requires additional work to be properly structured into tabular format. Depending on how you've configured AAC (v1 or v2), JSON files might require conversion before they are available for use in the application.
Note
AAC requires that you submit JSON files with one valid JSON object per line. Consistently malformed JSON objects or objects that overlap line breaks might cause import to fail.
Recommended limit of 1 GB in source file size. Since conversion happens within the Trifacta node, this limit might vary depending on the memory of the Trifacta node.
Each JSON record must be less than 20 MB in size.
Filename extensions must be .json or .JSON.
For best results, you should quote all keys and values and import them as strings.
You can escape quotes using the backslash character (\) to treat them as literals within your strings.
When you import the values into the Transformer page, AAC re-infers the data type for each column.
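The requirements above amount to newline-delimited JSON with string-typed fields. A sketch of producing such a file, assuming Python's standard json module (which escapes embedded quotes with a backslash automatically):

```python
import json

# Write one valid JSON object per line, with every value quoted as a
# string, as the import rules above require. The records and filename
# are illustrative.
records = [
    {"id": "1", "note": 'said "hello"'},
    {"id": "2", "note": "plain"},
]

with open("records.json", "w") as f:
    for rec in records:
        # str() every value so each field is imported as a string;
        # json.dumps escapes the embedded quotes as \" for us.
        f.write(json.dumps({k: str(v) for k, v in rec.items()}) + "\n")
```

On import, AAC re-infers a data type for each column, so quoting everything as strings does not lock the columns into String type.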
Plain Text
LOG
TSV
Parquet
Note
When working with datasets sourced from Parquet files, lineage information and the $sourcerownumber reference are not supported.
Avro
Note
When working with datasets sourced from Avro files, lineage information and the $sourcerownumber reference are not supported.
Google Sheets
Note
Individual users must enable access to their Google Drive. No data other than Google Sheets is read from Google Drive.
Converted File Formats
Files of the following type are not read into the product in their native format. Instead, these file types are converted using the Conversion Service into a file format that is natively supported, stored in the base storage layer, and then ingested for use in the product.
Note
Compressed files that require conversion of the underlying file format are not supported for use in the product.
Converted file formats:
Excel (XLS/XLSX)
Note
Other Excel-related formats, such as XLSM, are not supported. If you encounter issues, try using Save As from within Microsoft Excel to save the file to XLS or XLSX.
Tip
You can import multiple sheets from a single workbook at a time. AAC supports up to 399 sheets.
Google Sheets
Tip
You may import multiple sheets from a single Google Sheet at one time.
PDF
JSON
Native Output File Formats
Designer Cloud can write to these file formats:
Note
Some output formats may need to be enabled by an administrator.
CSV
JSON
Hyper
Note
Publication of results in Hyper format may require additional configuration. See below.
Avro
Note
The Trifacta Photon and Spark running environments apply Snappy compression to this format.
Parquet
Note
The Trifacta Photon and Spark running environments apply Snappy compression to this format.
Compression Algorithms
When a file is imported, AAC attempts to infer the compression algorithm in use based on the filename extension. For example, .gz files are assumed to be compressed with GZIP.
Note
Import of a compressed file whose underlying format requires conversion through the Conversion Service is not supported.
Read Native File Formats
Format | GZIP | BZIP | Snappy | Notes
---|---|---|---|---
CSV | Supported | Supported | Supported |
JSON v2 | Not supported | Not supported | Not supported | A converted file format. See above.
JSON v1 | Supported | Supported | Supported | Not a converted file format. See above.
Avro | | | Supported |
Write Native File Formats
Format | GZIP | BZIP | Snappy
---|---|---|---
CSV | Supported | Supported | Supported
JSON | Supported | Supported | Supported
Avro | | | Supported; always on
Snappy Compression Formats
Designer Cloud supports the following variants of Snappy compression format:
File extension | Format name | Notes |
---|---|---|
.sz | Framing2 format | See: https://github.com/google/snappy/blob/master/framing_format.txt |
.snappy | Hadoop-snappy format | See: https://code.google.com/p/hadoop-snappy/ Note: Xerial's snappy-java format, which is also written with a .snappy extension, is not supported. |
Supported File Formats by Application
Individual applications may support only a subset of the file formats and compression algorithms listed on this page.
Application | Description |
---|---|
Designer Experience | You can import a number of flat-file formats for use in Designer Experience. Go to Designer Cloud File Format Options. |
Trifacta Classic | All platform file formats and compression algorithms are supported. |
Reporting | Reporting uses data from your workflows as inputs. Go to Designer Experience. Reports can be exported in XLSX and PDF formats. Go to Reporting User Interface. |
Machine Learning | Uploaded data for model training or prediction must be a CSV file. For more information, go to Problem Setup and Export and Predict. |