- You cannot publish ad-hoc results for a job through the application while another publishing job for the same job is in progress. Wait until the previous publishing job has completed before retrying the failed publication. This is a known issue.
- If you run a job and then export the results to a relational target, Datetime columns are written to the relational table as String values. Publishing Datetime columns directly writes the output in the designated target data type. For more information, see Type Conversions.
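If a downstream consumer needs native Datetime values from such an export, the String column can be parsed back on the consuming side. The snippet below is an illustrative sketch only; the actual string format depends on your output settings, and the ISO-style format shown here is an assumption.

```python
from datetime import datetime

# Hypothetical exported value: a Datetime written to a relational
# target as a String. The format string below is an assumption for
# illustration, not the guaranteed export format.
exported_value = "2021-03-15 09:30:00"

# Parse the String back into a native datetime on the consuming side.
parsed = datetime.strptime(exported_value, "%Y-%m-%d %H:%M:%S")

print(parsed.isoformat())
```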
- If you run a job with a single relational target and it fails at the publication step, you cannot publish the transformation job through the Export Results window.
JSON-formatted files generated by Trifacta Wrangler Enterprise are rendered in JSON Lines format, a variant of JSON that stores one record per line. For more information, see http://jsonlines.org.
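Because each line is an independent JSON document, such files can be parsed line by line without loading the whole file. A minimal sketch, with made-up records standing in for actual job output:

```python
import json

# Two illustrative JSON Lines records (one JSON object per line).
jsonl_output = '{"id": 1, "name": "alpha"}\n{"id": 2, "name": "beta"}\n'

# Parse each non-empty line independently; this also works when
# streaming a large file one line at a time.
records = [json.loads(line) for line in jsonl_output.splitlines() if line]

print(len(records))
```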
Publish to Tableau Server
If you have created a Tableau Server connection, you can export generated results to the connected server.
Hyper: Results are written to your Tableau Server in Hyper format.
- Connection: If you have created multiple connections to Tableau Server, please select the connection to use from the list.
- The Site name is specified as part of the connection. See Create Tableau Server Connections.
- Project Name: Name of the Tableau Server project.
- Datasource Name: Name of the Tableau Server datasource. This value is displayed for selection in Tableau Server.
If you are publishing to a pre-existing table, schema validation is automatically performed.
- Create new datasource: The platform creates the datasource and then loads it with the results from this job. If you attempt to use this option on a source that already exists, the publishing job fails, and an error is generated in the log.
- Append data to existing datasource: The results from this job are appended to the data that is already stored in Tableau Server. If you attempt to append to a source that does not exist, the publishing job fails, and an error is generated in the log. Append operations also fail if you publish to a target with a different schema.
- Replace contents of existing datasource: Target datasource is dropped. A new datasource is created using the schema of the generated output and filled with the job results.
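The append-mode schema requirement above can be illustrated with a small sketch. This is not Trifacta or Tableau code; the schemas and the check are hypothetical, showing only that an append succeeds when the generated output matches the target's columns exactly and fails otherwise.

```python
# Hypothetical column-name-to-type schemas for illustration only.
existing_schema = {"id": "int", "name": "string"}
new_output_schema = {"id": "int", "name": "string", "score": "double"}

def can_append(target_schema, output_schema):
    """An append is valid only when both schemas match exactly."""
    return target_schema == output_schema

print(can_append(existing_schema, existing_schema))      # matching schema
print(can_append(existing_schema, new_output_schema))    # extra column
```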
Troubleshooting - Request timeout exception
When publishing to Tableau Server, you may encounter an error similar to the following for a PUT operation in the job log:
In this case, the size of individual chunks submitted to Tableau Server is too large. The PUT operation did not complete before a server timeout was encountered, and the operation failed.
To address this issue, you should lower the size of each chunk that is submitted to Tableau Server for publication. For more information, see Configure Data Service.
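Why a smaller chunk size helps can be sketched as follows: the same payload is split into more, smaller PUT requests, each of which is more likely to complete before the server-side timeout. The byte counts below are hypothetical, not recommended settings.

```python
def split_into_chunks(total_bytes, chunk_bytes):
    """Return the sizes of the chunks needed to upload total_bytes."""
    full, remainder = divmod(total_bytes, chunk_bytes)
    return [chunk_bytes] * full + ([remainder] if remainder else [])

payload = 250 * 1024 * 1024                              # 250 MB of results
large = split_into_chunks(payload, 100 * 1024 * 1024)    # 100 MB chunks
small = split_into_chunks(payload, 25 * 1024 * 1024)     # 25 MB chunks

# Fewer large requests vs. more small requests for the same payload.
print(len(large), len(small))  # 3 10
```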