Trifacta Dataprep




Dataprep by Trifacta® enables you to rapidly transform disparate datasets of any size into usable data for the entire enterprise. Ingest, explore, and transform your data through a leading-edge interface, reducing the time to prepare your data from weeks to minutes. Dataprep by Trifacta is integrated with the Google Cloud Platform and operated by partner Trifacta.

Applicable Product Editions

These setup instructions apply to the following editions of the product:

NOTE: These product editions are licensed through the Google Marketplace from Trifacta. For more information on licensing or upgrading from Dataprep by Trifacta, please see the Google Marketplace listing.

  • Dataprep by Trifacta Enterprise Edition
  • Dataprep by Trifacta Professional Edition
  • Dataprep by Trifacta Starter Edition
  • Dataprep by Trifacta Premium
  • Dataprep by Trifacta Standard
  • Dataprep by Trifacta Legacy

    NOTE: If you are an existing Dataprep by Trifacta Legacy customer, you can use the Marketplace to upgrade to one of the supported Marketplace editions or to enable your current product edition for a new project. You can also choose to continue using Dataprep by Trifacta Legacy.

For more information, see Product Editions.

For more information on available plans, see

Support packages

Trifacta offers a range of support packages. For more information, please contact Trifacta Support.


Before you begin, please review the following prerequisites.

NOTE: The name of the service account used by the product is provided by Google and cannot be modified.

NOTE: If domain-restricted sharing has been enabled as a policy in your enterprise, you must add a trust policy for the Trifacta GSuite domain. If you do not have the ID for this domain, it can be provided by Trifacta Support.

Set up a project

To use any of these product editions, you must have the following already set up in the Google Cloud Platform.

NOTE: If you are upgrading from Dataprep by Trifacta, you should already have these services enabled.

Create or set up a Google Cloud project. In the Cloud Console, on the project selector page, select or create a Cloud project.

Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.

Go to the project selector page

Enable billing on that project. Please verify that billing is enabled for your Google Cloud project. Learn how to confirm billing is enabled for your project.
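The project and billing steps above can also be sketched from the command line. A minimal sketch, assuming the gcloud CLI is installed and authenticated; the project ID is a hypothetical placeholder, and the gcloud billing command group may require a recent Cloud SDK:

```shell
# Hypothetical project ID -- substitute your own.
PROJECT_ID="dataprep-example-project"

if command -v gcloud >/dev/null 2>&1; then
  # Create the project and make it the active one.
  gcloud projects create "$PROJECT_ID"
  gcloud config set project "$PROJECT_ID"

  # Prints True once a billing account is linked to the project.
  gcloud billing projects describe "$PROJECT_ID" \
    --format="value(billingEnabled)"
else
  echo "gcloud CLI not installed; use the Cloud Console instead."
fi
```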

Enable services: In your project, enable the following services:

  1. Dataflow
  2. BigQuery 
  3. Cloud Storage APIs. See Enable the APIs.
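The three services above can be enabled in one step from the command line. A sketch, assuming the gcloud CLI and permission to enable services on the project; the API endpoint names are assumed from the standard Google Cloud service catalog:

```shell
# API endpoints assumed from the standard Google Cloud service catalog.
SERVICES="dataflow.googleapis.com bigquery.googleapis.com storage.googleapis.com"

if command -v gcloud >/dev/null 2>&1; then
  # Requires the serviceusage.services.enable permission on the project.
  gcloud services enable $SERVICES
else
  echo "would enable: $SERVICES"
fi
```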

Set up your storage bucket

On Base Storage, you must have a bucket set up for use with your project.

In the Cloud Console, navigate to the Cloud Storage Browser page. See

Click Create bucket.

In the Create bucket dialog, specify the following attributes:

  1. A unique bucket name. For more information on bucket name requirements, see
  2. A storage class. See
  3. A location where bucket data will be stored.

Click Create.
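Equivalent to the Console steps above, the bucket can be created with the gsutil CLI. A sketch; the bucket name, location, and storage class below are placeholders to substitute with your own values:

```shell
BUCKET="my-dataprep-bucket-example"   # must be globally unique
LOCATION="US"
STORAGE_CLASS="STANDARD"

if command -v gsutil >/dev/null 2>&1; then
  # -l sets the location, -c sets the storage class.
  gsutil mb -l "$LOCATION" -c "$STORAGE_CLASS" "gs://$BUCKET"
else
  echo "would run: gsutil mb -l $LOCATION -c $STORAGE_CLASS gs://$BUCKET"
fi
```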

Set up your staging bucket

By default, when you enable Dataprep by Trifacta in a project, a Base Storage staging bucket for Dataflow use is automatically created for you in a U.S. region. This staging bucket is used for staging assets for Dataflow jobs and is required for use with the product. If you do have permissions to create a storage bucket in the U.S., you do not need to create a storage bucket for staging and can skip to the next section.

NOTE: If you do not have permissions to create a Base Storage bucket in the U.S., you must create your own staging bucket before enabling Dataprep by Trifacta in your project. The name of this bucket must begin with the following text string: dataprep-staging- followed by an identifying value.

A bucket can be created from:

  • Google Console
  • Google CLI

The staging bucket can be changed:

  • During enablement of the product in a project. You can select a different staging bucket as needed. 
  • After the product has been enabled, individual users can configure the bucket to use for staging of their assets. See User Profile Page.
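Creating a staging bucket that satisfies the required naming convention can be sketched as follows; the suffix here is an arbitrary identifying value, and the final gsutil command assumes the CLI is installed with bucket-create permissions:

```shell
PREFIX="dataprep-staging-"
SUFFIX="$(date +%s)"            # any unique identifying value works
STAGING_BUCKET="${PREFIX}${SUFFIX}"

# Verify the required prefix before creating the bucket.
case "$STAGING_BUCKET" in
  "$PREFIX"*) echo "valid staging bucket name: $STAGING_BUCKET" ;;
  *) echo "name must start with $PREFIX" >&2; exit 1 ;;
esac

# Then create it in a U.S. region:
# gsutil mb -l US "gs://$STAGING_BUCKET"
```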

Whitelist the IP address range of the Trifacta Service

If you are connecting to relational sources, you must whitelist the IP address range of the Trifacta service for your database instances. The IP address range of the Trifacta service is the following:

NOTE: On the database server for each relational source type (Oracle, SQL Server, etc.), you must whitelist these IP addresses.

NOTE: Relational datasources must be available on a public IP address that is accessible from the product.

Tip: To verify that you have whitelisted the IP address range appropriately, you can create a connection of the relational connection type from inside the Trifacta application. This step is described later.

For more information, please contact Trifacta Support.
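As one concrete example, if a relational source is hosted on Cloud SQL, the Trifacta range can be added to the instance's authorized networks. A sketch; the instance name and CIDR below are hypothetical placeholders, not the actual Trifacta range, which is provided above or by Trifacta Support:

```shell
INSTANCE="my-sql-instance"        # hypothetical Cloud SQL instance name
TRIFACTA_RANGE="203.0.113.0/24"   # placeholder; use the actual Trifacta range

if command -v gcloud >/dev/null 2>&1; then
  # Note: --authorized-networks replaces the existing list, so include
  # any ranges that are already authorized on the instance.
  gcloud sql instances patch "$INSTANCE" \
    --authorized-networks="$TRIFACTA_RANGE"
else
  echo "would authorize $TRIFACTA_RANGE on $INSTANCE"
fi
```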

Purchase and enable through the Google Marketplace

After you have completed the above steps, please proceed through the Google Marketplace to complete your purchase. Your purchase covers:

  • Basic entitlement
  • Licensing for each Google Cloud project

NOTE: Changes to your billing information for a project have been known to cause forced cancellation of Dataprep by Trifacta in a project. If possible, please confirm that the billing information is properly set for the project before you enable the product on it. If you encounter a forced cancellation, please contact Google Support. For more information, see Enable or Disable Dataprep.

For more information, see .


After the product has been licensed for your project, please complete the following steps for your account.

Required permissions

When  Dataprep by Trifacta is enabled for your project, the Dataprep.user role is automatically assigned to each permitted user of the project. For basic access, no additional permissions are required.

NOTE: Depending on the permissions scheme in your enterprise, you may need to enable additional permissions to access features of the product or services in the Google Cloud Platform.

For more information, see Required Dataprep User Permissions.

Additional permissions

Some product editions require special permissions to use the project. For more information, see Create IAM Role for Dataprep.

Enable in the project

NOTE: Dataprep by Trifacta must be enabled in individual projects by the project owner.

  1. In the Google Cloud Console, select the project in which you wish to enable  Dataprep by Trifacta.
  2. Open the product. See
  3. As the project owner, you must enable access to project data for Google and Trifacta.


Each user of the project must do the following:

  1. In the Google Cloud Console, select the project in which you wish to enable  Dataprep by Trifacta.
  2. Open the product. See
  3. Accept the terms of service. 
  4. Select a Base Storage bucket to use with the product. For more information, see Enable or Disable Dataprep.
  5. The Trifacta application is displayed.
  6. The first time you log in, you can immediately upload a dataset and begin transforming it. For more information, see Import Basics.
  7. On subsequent logins, the Home page is displayed:

Figure: Home page

Project settings

The project owner should review the settings for your project. See Dataprep Project Settings Page.

Set up directories

Each user must configure the directories on Base Storage for use with the product. You can change the directories that are used for uploads, job runs, and temp storage. 

  1. In the left nav bar, select the User icon. 
  2. In the User menu, select Preferences.
  3. The User Profile page is displayed. 
  4. As needed, you can change the Upload, Job Run, and Temp directories in your bucket. To save your changes, click Done.

For more information, see User Profile Page.


If you have completed the above steps, you should verify operations.

Verify operations

Before inviting other users, you should run a simple job through the product. 

Prepare Your Sample Dataset

To complete this test, you should locate or create a simple dataset in the format that you wish to test. Your dataset should meet the following requirements:

Tip: The simplest way to test is to create a two-column CSV file with at least 25 non-empty rows of data. This data can be uploaded through the application.


  • Two or more columns. 
  • If there are specific data types that you would like to test, please be sure to include them in the dataset.
  • A minimum of 25 rows is required for best results of type inference.
  • Ideally, your dataset is a single file or sheet. 
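A dataset meeting the requirements above can be generated in a few lines of shell; a minimal sketch producing a two-column CSV with a header and 30 data rows, ready for upload through the application:

```shell
# Generate sample.csv: a header row plus 30 rows of (id, value) pairs.
OUT="sample.csv"
echo "id,value" > "$OUT"
i=1
while [ "$i" -le 30 ]; do
  echo "$i,item_$i" >> "$OUT"
  i=$((i + 1))
done

# 31 lines total: 1 header + 30 data rows.
wc -l < "$OUT"
```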

Store Your Dataset

If you are testing an integration, you should store your dataset in the datastore with which the product is integrated.

Tip: Uploading datasets is always available as a means of importing datasets.


  • You may need to create a connection between the platform and the datastore.
  • Read and write permissions must be enabled for the connecting user to the datastore.
  • For more information, see Connections Page.

Verification Steps


  1. Log in to the application.

  2. In the application menu bar, click Library.

  3. Click Import Data. See Import Data Page.
    1. Select the connection where the dataset is stored. For datasets stored on your local desktop, click Upload.
    2. Select the dataset.
    3. Click Continue.
  4. The initial sample of the dataset is opened in the Transformer page, where you can edit your recipe to transform the dataset.
    1. In the Transformer page, some steps are automatically added to the recipe for you, so you can run the job immediately.
    2. You can add additional steps if desired. See Transformer Page.
  5. Click Run.
    1. If options are presented, select the defaults.

    2. To generate results in other formats or output locations, click Add Publishing Destination. Configure the output formats and locations. 
    3. To test dataset profiling, click the Profile Results checkbox. Note that profiling runs as a separate job and may take considerably longer. 
    4. See Run Job Page.

  6. When the job completes, you should see a success message under the Jobs tab in the Flow View page. 
    1. Troubleshooting: Either the Transform job or the Profiling job may fail. To isolate the problem, try re-running the job with the failed job type deselected, or run the job on a different running environment (if available). You can also download the log files to try to identify the problem. See Job Details Page.
  7. Click View Results from the context menu for the job listing. In the Job Details page, you can see a visual profile of the generated results. See Job Details Page.
  8. In the Output Destinations tab, click a link to download the results to your local desktop. 
  9. Load these results into a local application to verify that the content looks correct.

Checkpoint: You have verified importing from the selected datastore and transforming a dataset. If your job was successfully executed, you have verified that the product is connected to the job running environment and can write results to the defined output location. Optionally, you may have tested profiling of job results. If all of the above tasks completed, the product is operational end-to-end.

Verify IP address whitelisting

If you have whitelisted the Trifacta service IP addresses for your database server, you can create a connection to the database from inside the Trifacta application. If you are able to successfully read data into the application from your database, then the whitelist has been specified correctly.

NOTE: The database to which you are connecting must be available from the Trifacta service over the public Internet.

For more information, see Connection Types.

Invite Users

You can invite other people to join your project at this time.

NOTE: First-time users of the product should access Dataprep by Trifacta by invitation only. Do not provide direct URLs to first-time users.

For more information, see

Example Flows

When a new workspace is created, the first user is provided a set of example flows. These flows are intended to teach by example and illustrate many recommended practices for building your own flows. For more information on example flows, see Workflow Basics.


The following resources can assist users in getting started with wrangling.

Tip: Check out the product walkthrough available through in-app chat! This walkthrough steps through each phase of ingesting, transforming, and generating results for your data.

For a quick start with Dataprep by Trifacta products, see Quickstart for Dataprep.

Check out the Trifacta Community.

For a basic summary of each step of the wrangling process, see Workflow Basics.

Access documentation: To access the full customer documentation, from the left nav bar, select Help menu > Documentation.

Additional Setup

Depending on your environment, the following additional configuration steps may be required. 
