...
You can use dates in the Gregorian calendar system only. Dates in the Julian calendar are not supported.
Data Validation
When values are validated against the Datetime data type, the application applies a regular expression validation pattern based on the specified date format.
However, some values may match the regular expression validation pattern yet not be accurate calendar dates. For example, February 29 is a valid date only in leap years. In other years, a value of February 29 may be detected as valid against the Datetime data type, while the application adjusts the date to the nearest accurate date, such as March 1 in this example.
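This distinction between pattern-valid and calendar-valid dates can be illustrated with a short Python sketch (this is an illustration of the general problem, not the application's validation logic):

```python
from datetime import datetime

def is_accurate_date(value, fmt="%m/%d/%Y"):
    # A pattern match alone is not enough; strict parsing catches
    # values such as February 29 in a non-leap year.
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False

print(is_accurate_date("02/29/2020"))  # True  (2020 is a leap year)
print(is_accurate_date("02/29/2021"))  # False (2021 is not)
```

Both inputs match the same `mm/dd/yyyy` pattern; only strict calendar parsing rejects the second.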
Formatting Tokens
You can use the following tokens to change the format of a column of dates:
...
For more information on supported date formatting strings, see DATEFORMAT Function.
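As a rough analogy for how formatting tokens work, Python's strftime codes render the same date in different layouts depending on the token string (the product's own token set is defined in the DATEFORMAT Function reference, not here):

```python
from datetime import datetime

# Illustration only: strftime tokens play the same role as
# date formatting tokens applied to a column of dates.
d = datetime(2020, 1, 10)
print(d.strftime("%m-%d-%y"))   # 01-10-20
print(d.strftime("%Y/%m/%d"))   # 2020/01/10
```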
Supported Time Zones
For more information, see Supported Time Zone Values.
Job Execution
Datetime data typing involves the basic type definition, plus any supported formatting options. Depending on where the job is executed, there may be variation in how the Datetime data type is interpreted.
Some running environments may perform additional inference on the typing.
NOTE: During job execution on Spark, inputs of Datetime data type may have their data type inferred for each row value individually. For example, the String value 01/10/2020 may be inferred by date transformations as October 1, 2020 or as January 10, 2020. Resulting outputs of Datetime values may not be deterministic in this scenario.

- Some formatting options may not be supported.
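The ambiguity described above can be reproduced in a few lines of Python (an illustration of why such values are ambiguous, not the product's inference logic):

```python
from datetime import datetime

value = "01/10/2020"

# The same string parses to two different dates depending on
# whether a month-first or day-first format is assumed.
as_month_first = datetime.strptime(value, "%m/%d/%Y")  # January 10, 2020
as_day_first = datetime.strptime(value, "%d/%m/%Y")    # October 1, 2020

print(as_month_first.date())  # 2020-01-10
print(as_day_first.date())    # 2020-10-01
```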
Differences between Photon and Spark
If your Datetime data does not contain time zone information, by default:

- Spark uses the time zone of the node for Datetime values.
- Photon uses the UTC time zone for Datetime values.
This difference in how the values are treated can result in differences in Datetime-based calculations, such as those performed by the DATEDIF function.
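A minimal Python sketch of why this matters (plain datetime arithmetic stands in for DATEDIF; the UTC-5 offset is an arbitrary example of a node time zone):

```python
from datetime import datetime, timezone, timedelta

# A timestamp with no time zone information.
naive = datetime(2020, 3, 1, 23, 30)

# Interpreted as UTC, it falls on March 1.
as_utc = naive.replace(tzinfo=timezone.utc)

# Interpreted in a node time zone of UTC-5 (hypothetical example),
# the same instant is March 2 in UTC.
as_node = naive.replace(tzinfo=timezone(timedelta(hours=-5)))

print(as_utc.date())                             # 2020-03-01
print(as_node.astimezone(timezone.utc).date())   # 2020-03-02
```

The two interpretations of the same input differ by a calendar day, so any day-difference calculation over them differs as well.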
Workarounds:

You can do one of the following:

- Set the time zone for the node to UTC. You must also set the time zone for your Spark running environment to UTC. Apply the following Spark property overrides:
```
"spark": {
  "props": {
    ...
    "spark.driver.extraJavaOptions": "-Duser.timezone=\"UTC\"",
    "spark.executor.extraJavaOptions": "-Duser.timezone=\"UTC\""
  },
  ...
}
```
For more information, see Spark Execution Properties Settings.
...
Datetime Schema via API
When Datetime data is returned via API calls, the schema for this information is returned as a three-element array. The elements beyond the data type itself are required to account for the formatting options for Datetime values.
Tip: Schema information for data types is primarily available via API calls. You may also find schema information for columns in the JSON versions of the visual profile and flow definitions when they are exported.
Example:

```
"end_date": [
  "Datetime",
  "mm-dd-yy",
  "mm*dd*yyyy"
]
```
| Array Element | Description | Example 1 | Example 2 |
|---|---|---|---|
| Data type | The internal name for the data type. For Datetime columns, this schema value should always be Datetime. | "Datetime" | "Datetime" |
| Sub-format | The general format category of the data type. | "mm-dd-yy" | "mm-dd-yy" |
| Format type | The specific formatting for the data type. | "mm*dd*yyyy" | "shortMonth*dd*yy" |
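A client consuming this schema can unpack the three-element array directly. The sketch below assumes a JSON response shaped like the example above (the `end_date` key and response fragment are illustrative, not a documented endpoint payload):

```python
import json

# Hypothetical API response fragment containing a Datetime column schema.
response = '{"end_date": ["Datetime", "mm-dd-yy", "mm*dd*yyyy"]}'
schema = json.loads(response)

# Unpack the three schema elements for the column.
data_type, sub_format, format_type = schema["end_date"]
print(data_type)    # Datetime
print(sub_format)   # mm-dd-yy
print(format_type)  # mm*dd*yyyy
```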