After you have applied a configuration change to the platform and restarted, you can use the following steps to verify that the platform is working correctly.
If your configuration change was applied to trifacta-conf.json, you should restart the platform before continuing. See Start and Stop the Platform.
Prepare Your Sample Dataset
To complete this test, locate or create a simple dataset in the format that you wish to test. Your dataset should have the following characteristics:
- Two or more columns.
- If there are specific data types that you would like to test, please be sure to include them in the dataset.
- Include a minimum of 25 rows for best type-inference results.
- Ideally, your dataset is a single file or sheet.
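If you prefer to generate a test dataset rather than locate one, a short script can produce a CSV that meets the requirements above. This is a minimal sketch; the column names and values are illustrative, not required by the product:

```python
import csv
import io
import random

def make_sample_dataset(rows=30):
    """Build a small CSV with mixed column types (string, integer,
    float, date, boolean) suitable for exercising type inference."""
    random.seed(42)  # reproducible sample values
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["name", "quantity", "price", "order_date", "shipped"])
    for i in range(rows):
        writer.writerow([
            f"item-{i}",                               # string column
            random.randint(1, 100),                    # integer column
            round(random.uniform(0.5, 99.9), 2),       # float column
            f"2023-01-{(i % 28) + 1:02d}",             # date column
            random.choice(["true", "false"]),          # boolean column
        ])
    return buf.getvalue()

# Write the sample to a single file, satisfying the row and column minimums.
with open("sample_dataset.csv", "w", newline="") as f:
    f.write(make_sample_dataset())
```

Save the output wherever you plan to import from: your local desktop for an upload test, or the integrated datastore for a connection test.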
Store Your Dataset
If you are testing an integration, you should store your dataset in the datastore with which the product is integrated.
Tip: Uploading is always available as a method of importing datasets.
- You may need to create a connection between the platform and the datastore.
- Read and write permissions must be enabled for the connecting user to the datastore.
- For more information, see Connections Page.
- Log in to the application. See Login.
- In the application menu bar, click Library.
- Click Import Data. See Import Data Page.
- Select the connection where the dataset is stored. For datasets stored on your local desktop, click Upload.
- Select the dataset.
- In the right panel, click the Add Dataset to a Flow checkbox. Enter a name for the new flow.
- Click Import and Add to Flow.
Troubleshooting: At this point, you have verified read access to your datastore from the platform. If the import failed, check the logs, permissions, and your Trifacta® configuration.
- In the left menu bar, click the Flows icon. In the Flows page, open the flow you just created. See Flows Page.
- In Flow View, click the dataset you just imported. Click Add new Recipe.
- Select the recipe. Click Edit Recipe.
- The initial sample of the dataset is opened in the Transformer page, where you can edit your recipe to transform the dataset.
- In the Transformer page, some steps are automatically added to the recipe for you, so you can run the job immediately.
- You can add additional steps if desired. See Transformer Page.
- Click Run Job.
If options are presented, select the defaults.
- To generate results in other formats or output locations, click Add Publishing Destination. Configure the output formats and locations.
- To test dataset profiling, click the Profile Results checkbox. Note that profiling runs as a separate job and may take considerably longer.
- See Run Job Page.
Troubleshooting: Later, you can re-run this job on a different running environment. Some formats are not available across all running environments.
- When the job completes, you should see a success message under the Jobs tab in the Flow View page.
- Troubleshooting: Either the Transform job or the Profiling job may fail. To isolate the problem, try re-running the job with the failed job type deselected, or run the job on a different running environment (if available). You can also download the log files to try to identify the problem. See Job Details Page.
- Click View Results from the context menu for the job listing. In the Job Details page, you can see a visual profile of the generated results. See Job Details Page.
- In the Output Destinations tab, click a link to download the results to your local desktop.
- Load these results into a local application to verify that the content looks correct.
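If you prefer a scripted check over opening the file in a local application, a few basic assertions can confirm that the downloaded CSV is well formed. This is a sketch; `results.csv` stands in for whatever filename you downloaded:

```python
import csv

def sanity_check_csv(path, min_rows=1):
    """Basic checks on exported results: a header row exists, every
    data row has no more fields than the header, and there are at
    least min_rows data rows."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        rows = list(reader)
    assert header, "missing header row"
    assert len(rows) >= min_rows, "no data rows found"
    widths = {len(r) for r in rows}
    assert widths <= {len(header)}, f"ragged rows: field counts {widths}"
    return header, len(rows)

# Example: verify the downloaded results file has at least 25 data rows.
# sanity_check_csv("results.csv", min_rows=25)
```

If the assertions pass, the written output matches the shape of the dataset you transformed.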
Checkpoint: You have verified importing from the selected datastore and transforming a dataset. If your job was successfully executed, you have verified that the product is connected to the job running environment and can write results to the defined output location. Optionally, you may have tested profiling of job results. If all of the above tasks completed, the product is operational end-to-end.