...
Parameter | Description | Applicable CLI Commands
---|---|---
command_type | The type of CLI command to execute. For accepted values, see the examples below. For more information on these commands, see CLI for Connections. | All
user_name | (Required) Username of the user executing the command. | All
password | Password for the specified user. If no password is specified, you are prompted to enter one. | All
cli_output_path | Defines the client-side path where the JSON output is stored for all commands. Default value is | All
disable_ssl_certification | (Optional) When communicating over HTTPS, this setting can be used to override the default behavior of validating the server certificate before executing the command. | All
conn_ssl | (Optional) Connect to the datastore over SSL. | All
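
As a quick illustration, the sketch below combines these common parameters in a single publish invocation. It assumes that conn_ssl and disable_ssl_certification are passed using the same `--<parameter_name>` convention as `--user_name` and `--password` in the examples that follow; verify the exact flag names with your installation's CLI help.

```
# Hypothetical combined invocation (flag spellings for the two SSL options are assumed):
./trifacta_cli.py publish --user_name <trifacta_user> --password <trifacta_password> --job_id 42 --database dev --table table_job_42 --conn_name 1 --publish_format avro --conn_ssl --disable_ssl_certification --cli_output_path ./publish_info.out
```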
...
You execute one publish command for each output you wish to write to a supported database table. A new database table is created every run.
Command
Example (All one command):
```
./trifacta_cli.py publish --user_name <trifacta_user> --password <trifacta_password> --job_id 42 --database dev --table table_job_42 --conn_name 1 --publish_format avro --cli_output_path ./publish_info.out
```
Output
```
Publish (Create new table every run) has been successfully launched.
You may monitor the progress of your publish job here: http://localhost:3005/jobs
Upon success, you may view the results of your publish job here: http://localhost:3005/jobs/42
```
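
In addition to the console message above, the command writes its JSON results to the file given by --cli_output_path. If Python is available on the client, one way to review that file is to pretty-print it (shown here against the publish_info.out path used in the example; the structure of the JSON is not documented in this section):

```
# Pretty-print the JSON results written by the publish command
python -m json.tool ./publish_info.out
```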
...
You can load data into pre-existing Redshift tables.
- Data is appended after any existing rows.
- If the table does not exist, the job fails.
NOTE: When appending data into a Redshift table, the columns displayed in the Transformer page must match the order and data type of the columns in the target table.
...
```
./trifacta_cli.py load_data --user_name <trifacta_user> --password <trifacta_password> --job_id 42 --database dev --table table_42 --conn_name aSQLServerConnection --publish_format avro --cli_output_path ./load_info.out
```
Output
```
Load data/Append (Append to this table every run) has been successfully launched.
You may monitor the progress of your publish job here: http://localhost:3005/jobs
Upon success, you may view the results of your Load data/Append job here: http://localhost:3005/jobs/42
```
...
Truncate and load
For existing Hive tables, you can clear their contents and load them with the results of a job. You cannot truncate and load into Redshift tables.
Command
Example (All one command):
...