Release 6.8.2


NOTE: The Alteryx® data types listed on this page reflect the raw data type of the converted column. Depending on the contents of the column, the Transformer page may re-infer a different data type when a dataset that uses this type of source is loaded.

Access/Read

When a Databricks Tables data type is imported, its JDBC data type is remapped according to the following table.

Tip: Data precision may be lost during conversion. You may want to generate min and max values and compute significant digits for values in your Databricks tables and then compute the same in the Designer Cloud application.
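
One way to gather those statistics on the Databricks side is with a short PySpark snippet run in a Databricks notebook. This is a rough sketch, not a feature of the Designer Cloud application; it assumes a notebook where the spark session is already defined, and the table name is hypothetical.

    # Minimal sketch (assumptions: Databricks notebook with `spark` defined;
    # the table name is hypothetical). Computes min and max for each numeric
    # column so you can compare precision with what the Designer Cloud
    # application reports.
    from pyspark.sql import functions as F

    df = spark.table("my_database.my_table")

    numeric_prefixes = ("tinyint", "smallint", "int", "bigint", "float", "double", "decimal")
    numeric_cols = [name for name, dtype in df.dtypes if dtype.startswith(numeric_prefixes)]

    stats = df.select(
        *[F.min(c).alias(c + "_min") for c in numeric_cols],
        *[F.max(c).alias(c + "_max") for c in numeric_cols],
    )
    stats.show(truncate=False)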

Source Data Type | Supported? | Alteryx Data Type | Notes
array            | Y          | Array             |
bigint           | Y          | Integer           | NOTE: The Designer Cloud Powered by Trifacta platform may infer bigint columns containing very large or very small values as String data type.
binary           | Y          | String            |
boolean          | Y          | Bool              |
char             | Y          | String            |
date             | Y          | Datetime          |
decimal          | Y          | Decimal           |
double           | Y          | Decimal           |
float            | Y          | Decimal           | NOTE: On import, some float columns may be interpreted as Integer data type in the Designer Cloud Powered by Trifacta platform. To fix, you can explicitly set the column's data type to Decimal in the Transformer page.
int              | Y          | Integer           |
map              | Y          | Object            |
smallint         | Y          | Integer           |
string           | Y          | String            |
struct           | Y          | Object            |
timestamp        | Y          | Datetime          |
tinyint          | Y          | Integer           |
uniontype        | N          |                   |
varchar          | Y          | String            |
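
If you want to see the source-side types that will be remapped, you can list a table's schema in a Databricks notebook and look each type up in the table above. A minimal sketch under the same assumptions as the earlier snippet (spark session available, hypothetical table name):

    # Minimal sketch (assumptions: Databricks notebook with `spark` defined;
    # the table name is hypothetical). Prints each column with its Databricks
    # Tables data type.
    df = spark.table("my_database.my_table")
    for name, dtype in df.dtypes:
        print(name, dtype)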

Write/Publish

Create new table

Alteryx Data Type | Databricks Tables Data Type | Notes
String            | string                      |
Integer           | bigint                      | NOTE: The Designer Cloud Powered by Trifacta platform may infer Integer columns containing very large or very small values as String data type. Before you publish, you should verify that your columns containing extreme values are interpreted as Integer type. You can import a target schema to assist in lining up your columns with the expected target. For more information, see Overview of RapidTarget. A sample range check is shown below this table.
Decimal           | double                      |
Bool              | boolean                     |
Datetime          | timestamp/string (see Notes on Datetime columns below) | Target data type is based on the underlying data. Time zone information is retained.
Object            | string                      |
Array             | string                      |
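
The Databricks Tables bigint type is a signed 64-bit integer, so one way to check for the extreme values mentioned in the Integer note above is to count values outside that range before you publish. The following is a rough PySpark sketch under the same assumptions as the earlier snippets (Databricks notebook with spark defined, hypothetical table and column names); it is not part of the Designer Cloud product.

    # Minimal sketch (assumptions: Databricks notebook with `spark` defined;
    # table and column names are hypothetical). Counts values outside the
    # signed 64-bit range of the Databricks Tables bigint type, which are the
    # values most likely to be re-inferred as String.
    from pyspark.sql import functions as F

    BIGINT_MIN = -(2 ** 63)      # -9223372036854775808
    BIGINT_MAX = 2 ** 63 - 1     #  9223372036854775807

    df = spark.table("my_database.my_source")
    # Cast through a wide decimal so the comparison works regardless of the
    # column's raw type (string, double, decimal, and so on).
    col = F.col("big_id").cast("decimal(38,0)")

    out_of_range = df.filter((col < BIGINT_MIN) | (col > BIGINT_MAX))
    print("Values outside the bigint range:", out_of_range.count())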

Append to existing table

If you are publishing to a pre-existing table, the following data type conversions apply:

  • Columns: Alteryx data types
  • Rows: Target table data types

In any table cell, a Y indicates that the append operation for that data type mapping is supported.

NOTE: You cannot append to Databricks Tables map and array column types from Alteryx columns of Map and Array type, even if you imported data from this source.


Target Data Type | String | Integer | Datetime | Bool | Decimal | Map | Array | Out of Range error
CHAR             | Y      | Y       | Y        | Y    | Y       | Y   | Y     |
VARCHAR          | Y      | Y       | Y        | Y    | Y       | Y   | Y     |
STRING           | Y      | Y       | Y        | Y    | Y       | Y   | Y     |
INT              |        | Y       |          |      |         |     |       | NULL
BIGINT           |        | Y       |          |      |         |     |       | n/a
TINYINT          |        |         |          |      |         |     |       | NULL
SMALLINT         |        |         |          |      |         |     |       | NULL
DECIMAL          |        | Y       |          |      | Y       |     |       | NULL
DOUBLE           |        | Y       |          |      | Y       |     |       | n/a
FLOAT            |        |         |          |      | Y       |     |       | NULL
TIMESTAMP        |        |         | Y        |      |         |     |       |
BOOLEAN          |        |         |          | Y    |         |     |       |

Notes on Datetime columns

Run Job

Columns in new tables that are created for the output of Datetime columns are written with the Databricks Tables timestamp data type. These columns can be appended to.

A single job cannot write Datetime values to one table as String type and to another table as Timestamp type. This type of job should be split into multiple jobs. The table schemas may require modification.

  • The above issue may appear as the following error when executing the job:

    Unable to publish due to datetime data type conflict in column XXXX
