Before you import a packaged flow into a Production environment, you may need to apply import rules to remap objects and locations from the source instance to the new instance. Import mapping rules are not required when importing into the same environment, although they may be helpful in some cases.

Tip: If you are importing a flow that references file-based sources and wish to use the original files in your imported flow, you may find it easier to configure the importing user's permissions to access the appropriate source directories and then to swap datasets as needed after you complete the import. This method is easier when you are working with a small number of flows.

NOTE: Import mapping rules apply to deployments in a Production instance under deployment management. You cannot apply import mapping rules between two Dev instances.

NOTE: Import mapping rules require the use of the APIs made available from the Trifacta® platform. API usage is considered a developer-level skill.

  • For more information on creating an export package, see Export Flow.
  • For more information on how to import, see Import Flow.

You can apply the following types of remappings:

  • Value: For value remappings, you can specify rules to match on specific values or patterns of values in the import package and remap those values for use in the new instance.

    NOTE: In this release, value remapping is supported only for S3 bucket names and paths to imported datasets and output locations. Examples are provided below.

  • Object: For object remappings, you can specify rules to match a value listed in the import package and remap that value to a defined object in the new instance.

    NOTE: In this release, object remapping is supported only for connections. An example is provided below.

Import Rules

When a flow is imported, references in the flow definition that apply in the source instance may not apply in the target instance. For example, the location paths to the source datasets may need to be rewritten to point to a different location in the target instance. 

Before you import your flow definition, you need to define rules for any value or object remapping that must be done in the target environment. 

Notes on import rules

  1. Value and object remapping rules should be completed before you import the flow. The flow may be non-functional until the rules are applied.

    Tip: After you create your import rules, you can perform a dry run of the import via the API. Any errors are reported in the response. Details are provided below.


  2. Value and object remapping rules are applied at the time of import.  If you add new rules, they are not retroactively applied to release packages that have already been imported.
  3. When changing rules:
    1. Any rules previously applied to the same import object are deleted.
    2. You can apply multiple rules in the same change. 
    3. Rules are applied in the order in which they are listed in the request. Rules listed later in the request must be compatible with expected changes applied by the earlier rules. 
  4. Value and object remapping must be completed via API. API usage is considered a developer-level skill. Examples are provided below.
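The ordering behavior in item 3 can be illustrated with a short sketch. The following Python snippet is illustrative only (the actual substitution is performed by the platform at import time, and the rule values here are hypothetical); it applies two literal rules in sequence, showing that the second rule operates on the output of the first:

```python
# Hypothetical rules, applied in the order listed, as the platform does.
rules = [
    ("/dev/", "/stage/"),
    ("/stage/", "/prod/"),
]

path = "hdfs://datasets/dev/myData.csv"
for on, with_value in rules:
    path = path.replace(on, with_value)

# The second rule sees the first rule's output, so the final path
# contains /prod/, not /stage/. Reversing the rule order would
# instead leave the path at /stage/.
print(path)
```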

NOTE: Import mapping rules do not work for parameterized datasets. If the imported dataset with parameters is still accessible, you should be able to run jobs from it.

Import Rule Requirements

  • If you are importing into the same instance from which you exported (Dev/Test/Prod on the same instance):
    • Import rules are not required.
    • If you want to use a different source of data in your Prod flow, you must create import rules.
  • If you are importing into a different instance from the one you exported from (Dev and Prod on different instances):
    • Import rules are required, except in unusual cases.

Import Rule Types

The following types of rules can be applied to import mappings.

NOTE: Depending on the type of mapping, some of these rules may not apply. Please be sure to review the Examples below.

Object Mapping Types

  • tableName: Set this value to connections. You must then specify the uuid of the connection in the imported flow, which is remapped to the internal identifier (id) of the connection in the importing instance.

Value Mapping Types

  • fileLocation: This type is used to remap paths to files.

    NOTE: fileLocation rules apply to both input and output paths. Paths and their rules should be defined with care.

  • s3Bucket: (AWS) Name of the S3 bucket to remap.
  • dbTableName: (relational source) Name of the table to remap.
  • dbPath: (relational source) Path to the database table. This value is an array.
  • host: (Azure) Depending on the Azure datastore, this rule replaces:
    • WASB: blobhost name
    • ADLS Gen2: storage account name
    • ADLS Gen1: datastore in the data lake
  • userinfo: (Azure) Depending on the Azure datastore, this rule replaces:
    • WASB: container name
    • ADLS Gen2: filesystem name

Examples

The following are some example import rules to address specific uses.

Example - Replace a connection

In the following example, you must remap the connection from the source instance of the platform to the corresponding connection in the instance into which you are importing. 

First, you must be able to uniquely identify the connection from the source that you wish to remap. 

  • While the connection id may work in a limited scope, that identifier is unlikely to be unique within your environment.
  • If you already know the uuid of the connection from the source system, you can skip the first step below.

From the API response containing a connection definition, you can acquire the uuid value for the connection, which uniquely identifies the connection object across all instances of the platform:

API Endpoint (from the source instance): /v4/connections
Method: GET
Request Body: None.
Response Body:
{
    "data": [
        {
            "connectParams": {
                "vendor": "redshift",
                "vendorName": "redshift",
                "host": "redshift.example.com",
                "port": "5439",
                "extraLoadParams": "BLANKSASNULL EMPTYASNULL TRIMBLANKS TRUNCATECOLUMNS",
                "defaultDatabase": "test"
            },
            "id": 2,
            "host": "redshift.example.com",
            "port": 5439,
            "vendor": "redshift",
            "params": {
                "extraLoadParams": "BLANKSASNULL EMPTYASNULL TRIMBLANKS TRUNCATECOLUMNS",
                "defaultDatabase": "test"
            },
            "ssl": false,
            "vendorName": "redshift",
            "name": "redshift",
            "description": null,
            "type": "jdbc",
            "isGlobal": true,
            "credentialType": "custom",
            "credentialsShared": true,
            "uuid": "097c2300-2f6a-11e9-a585-57562e0d9cd6",
            "disableTypeInference": false,
            "createdAt": "2019-02-13T08:33:28.368Z",
            "updatedAt": "2019-02-13T08:33:28.381Z",
            "credentials": [
                {
                    "iamRoleArn": "arn:aws:iam:something",
                    "username": "UserName"
                }
            ],
            "creator": {
                "id": 1
            },
            "updater": {
                "id": 1
            },
            "workspace": {
                "id": 1
            }
        },
        {
            "connectParams": {
                "vendor": "hive",
                "vendorName": "hive",
                "host": "hadoop",
                "port": "10000",
                "jdbc": "hive2",
                "defaultDatabase": "default"
            },
            "id": 1,
            "host": "hadoop",
            "port": 10000,
            "vendor": "hive",
            "params": {
                "jdbc": "hive2",
                "connectStringOptions": "",
                "defaultDatabase": "default"
            },
            "ssl": false,
            "vendorName": "hive",
            "name": "hive",
            "description": null,
            "type": "jdbc",
            "isGlobal": true,
            "credentialType": "conf",
            "credentialsShared": true,
            "uuid": "08a1a180-2f6a-11e9-b2b2-85d2b0b67f5e",
            "disableTypeInference": false,
            "createdAt": "2019-02-13T08:33:26.936Z",
            "updatedAt": "2019-02-13T08:33:26.952Z",
            "credentials": [],
            "creator": {
                "id": 1
            },
            "updater": {
                "id": 1
            },
            "workspace": {
                "id": 1
            }
        }
    ],
    "count": 2
}
 
Documentation: See API Connections Get v4.

In the above, you identify that the connection used for the exported flow is the Redshift one. This object has the following unique identifier:

"uuid": "097c2300-2f6a-11e9-a585-57562e0d9cd6"

In the target system, you must now create a rule in the deployment into which you are importing that searches for this unique value. In the following example:

  • The deploymentId is known to be 4.
  • The connectionId for the equivalent Redshift connection in the target system is 1.

The uuid field in the import package is searched for the matching string. If it is found, the connection in the import package is replaced with the connection in the target system with an Id of 1:

API Endpoint: /v4/deployments/4/objectImportRules
Method: PATCH
Request Body:
[
  {
    "tableName": "connections",
    "onCondition": {
      "uuid": "097c2300-2f6a-11e9-a585-57562e0d9cd6"
    },
    "withCondition": {
      "id": 1
    }
  }
]
Status Code (Success): 200 - OK
Response Body:

When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request.

In the following example, there were no rules, so nothing was deleted:

{
    "deleted": []
}
Documentation: See API Deployments Object Import Rules Patch v4.

To test your rule, perform a dry run of the import. See below.

Example - Remap an HDFS location

In this example, your import rule must remap the path to the source from your Dev paths to your Prod paths. Suppose the pattern looks like this:

Dev Path: hdfs://datasets/dev/1/164e0bca-8c91-4e3c-9d0a-2a85eedec817/myData.csv
Prod Path: hdfs://datasets/prod/1/164e0bca-8c91-4e3c-9d0a-2a85eedec817/myData-Prod.csv

Note the differences:

  • The /dev/ part of the path has been replaced by /prod/.
  • The filename is different.

You can use the following value import rules to change the path values. In the following example, the rules are applied separately.

NOTE: You can specify multiple rules in a single request. Rules are applied in the order in which they are listed. Later rules must account for the results of earlier rules.

Request:

API Endpoint: /v4/deployments/4/valueImportRules
Method: PATCH
Request Body:


[
  {"type":"fileLocation","on":"/\/dev\//","with":"/prod/"},
  {"type":"fileLocation","on":"/\/([a-zA-Z0-9_]*).csv/","with":"$1-Prod.csv"}
]
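To sanity-check patterns like these before submitting them, you can approximate the rule semantics locally. The following Python sketch mirrors the two rules above, under the assumption that the on value is treated as a regular expression and that $1 in the with value corresponds to the first capture group (written \1 in Python); the sketch preserves the path separator in the replacement:

```python
import re

path = "hdfs://datasets/dev/1/164e0bca-8c91-4e3c-9d0a-2a85eedec817/myData.csv"

# Rule 1: replace the /dev/ segment with /prod/.
path = re.sub(r"/dev/", "/prod/", path)

# Rule 2: rename the file, keeping the captured base name ($1 -> \1).
# Hyphens are excluded from the character class, so only the final
# /myData.csv segment matches.
path = re.sub(r"/([a-zA-Z0-9_]*)\.csv", r"/\1-Prod.csv", path)

print(path)
# hdfs://datasets/prod/1/164e0bca-8c91-4e3c-9d0a-2a85eedec817/myData-Prod.csv
```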


Response:

Status Code (Success): 200 - OK
Response Body:

When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request.

In the following example, there were no rules, so nothing was deleted:

{
    "deleted": []
}
Documentation: See API Deployments Value Import Rules Patch v4.


To test your rule, perform a dry run of the import. See below.

Example - Remap an S3 location

For S3 sources, you can apply remapping rules including changing to a new S3 bucket. 

In this example, your import rule must remap the path to the source from your Dev paths to your Prod paths. Suppose the pattern looks like this:

Dev S3 Bucket Name: wrangle-dev
Dev Path: /projs/tweets/v04/tweets_month.csv
Prod S3 Bucket Name: wrangle-prod
Prod Path: /tweets/tweets_month.csv

You can use the following value import rules to change the bucket name and path values.

NOTE: You can specify multiple rules in a single request. Rules are applied in the order in which they are listed. Later rules must account for the results of earlier rules.

s3Bucket name rule: This rule replaces the S3 bucket name with the new one: wrangle-prod.

fileLocation rule: This rule uses regular expressions to match each segment of the path in the source bucket's paths. 

  • Files are located at a consistent depth in the source bucket.
  • Path segments and filename use only alphanumeric values and underscores (_). 
  • The replacement path is shortened to contain only the parent name ($2) and the filename ($4) in the path.

Request:

API Endpoint: /v4/deployments/4/valueImportRules
Method: PATCH
Request Body:


[
  {"type":"s3Bucket","on":"wrangle-dev","with":"wrangle-prod"},
  {"type":"fileLocation","on":"/\/([a-zA-Z0-9_]*)\/([a-zA-Z0-9_]*)\/([a-zA-Z0-9_]*)\/([a-zA-Z0-9_]*).csv/","with":"/$2/$4.csv"}
]
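The two rules above can be checked locally with a short sketch. This Python snippet approximates the rule semantics (assuming the on value of a fileLocation rule is a regular expression and $2/$4 correspond to capture groups \2/\4 in Python):

```python
import re

bucket = "wrangle-dev"
path = "/projs/tweets/v04/tweets_month.csv"

# s3Bucket rule: literal bucket rename.
bucket = bucket.replace("wrangle-dev", "wrangle-prod")

# fileLocation rule: capture the four path segments and keep only
# the parent name ($2 -> \2) and the filename ($4 -> \4).
path = re.sub(
    r"/([a-zA-Z0-9_]*)/([a-zA-Z0-9_]*)/([a-zA-Z0-9_]*)/([a-zA-Z0-9_]*)\.csv",
    r"/\2/\4.csv",
    path,
)

print(bucket, path)  # wrangle-prod /tweets/tweets_month.csv
```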


Response:

Status Code (Success): 200 - OK
Response Body:

When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request.

In the following example, there were no rules, so nothing was deleted:

{
    "deleted": []
}
Documentation: See API Deployments Value Import Rules Patch v4.

To test your rule, perform a dry run of the import. See below.

Example - Remap a WASB location

For WASB sources, you can apply remapping rules during import. 

In this example, your import rule must remap the blob host, container, and file location:

Dev Blobhost: storage-wasb-account-dev.blob.core.windows.net
Dev Container: container-dev
Dev File Location: /projs/work/orders.csv
Prod Blobhost: storage-wasb-account-prod.blob.core.windows.net
Prod Container: container-prod
Prod File Location: /2003/transactions/orders.csv

You can use the following value import rules to change the blobhost, container, and file paths.

NOTE: You can specify multiple rules in a single request. Rules are applied in the order in which they are listed. Later rules must account for the results of earlier rules.

host rule: This rule replaces the blobhost name with the new one: storage-wasb-account-prod.blob.core.windows.net.

userinfo rule: This rule replaces the container name with the new one: container-prod.

fileLocation rule: This rule performs a text substitution to replace the file path. This rule applies to both input and output object file paths.

Request:

API Endpoint: /v4/deployments/4/valueImportRules
Method: PATCH
Request Body:


[
  {"type":"host","on":"storage-wasb-account-dev.blob.core.windows.net","with":"storage-wasb-account-prod.blob.core.windows.net"},
  {"type":"userinfo","on":"container-dev","with":"container-prod"},
  {"type":"fileLocation","on":"/projs/work/orders.csv","with":"/2003/transactions/orders.csv"}
]
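Applied in order, the three literal rules above rewrite every part of the source reference. The following Python sketch illustrates the effect on a dev URI (the wasbs:// URI form used here is illustrative, assembled from the blobhost, container, and path values above):

```python
# Hypothetical dev URI assembled from the blobhost, container, and
# file location in the mapping table; the wasbs:// form is illustrative.
uri = ("wasbs://container-dev@storage-wasb-account-dev.blob.core.windows.net"
       "/projs/work/orders.csv")

# The three rules from the request body, applied in order as
# literal text substitutions.
rules = [
    ("storage-wasb-account-dev.blob.core.windows.net",
     "storage-wasb-account-prod.blob.core.windows.net"),          # host
    ("container-dev", "container-prod"),                          # userinfo
    ("/projs/work/orders.csv", "/2003/transactions/orders.csv"),  # fileLocation
]
for on, with_value in rules:
    uri = uri.replace(on, with_value)

print(uri)
# wasbs://container-prod@storage-wasb-account-prod.blob.core.windows.net/2003/transactions/orders.csv
```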


Response:

Status Code (Success): 200 - OK
Response Body:

When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request.

In the following example, there were no rules, so nothing was deleted:

{
    "deleted": {
        "data": []
    }
}
Documentation: See API Deployments Value Import Rules Patch v4.

To test your rule, perform a dry run of the import. See below.

Example - Remap an ADLS Gen2 location

For ADLS Gen2 sources, you can apply remapping rules during import. 

In this example, your import rule must remap the storage account, filesystem, and file location:

Dev Storage Account: storage-adlsgen2-account-dev.blob.core.windows.net
Dev Filesystem: filesystem-dev
Dev File Location: /projs/work/orders.csv
Prod Storage Account: storage-adlsgen2-account-prod.blob.core.windows.net
Prod Filesystem: filesystem-prod
Prod File Location: /2003/transactions/orders.csv

You can use the following value import rules to change the storage account, filesystem, and file paths.

NOTE: You can specify multiple rules in a single request. Rules are applied in the order in which they are listed. Later rules must account for the results of earlier rules.

host rule: This rule replaces the storage account name with the new one: storage-adlsgen2-account-prod.blob.core.windows.net.

userinfo rule: This rule replaces the filesystem name with the new one: filesystem-prod.

fileLocation rule: This rule performs a text substitution to replace the file path. This rule applies to both input and output object file paths.

Request:

API Endpoint: /v4/deployments/4/valueImportRules
Method: PATCH
Request Body:


[
  {"type":"host","on":"storage-adlsgen2-account-dev.blob.core.windows.net","with":"storage-adlsgen2-account-prod.blob.core.windows.net"},
  {"type":"userinfo","on":"filesystem-dev","with":"filesystem-prod"},
  {"type":"fileLocation","on":"/projs/work/orders.csv","with":"/2003/transactions/orders.csv"}
]


Response:

Status Code (Success): 200 - OK
Response Body:

When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request.

In the following example, there were no rules, so nothing was deleted:

{
    "deleted": {
        "data": []
    }
}
Documentation: See API Deployments Value Import Rules Patch v4.

To test your rule, perform a dry run of the import. See below.

Example - Remap an ADLS Gen1 location

For ADLS Gen1 sources, you can apply remapping rules during import. 

In this example, your import rule must remap the Azure data lake store and file location:

Dev Data Store: adl://storage-adlsgen1-account-dev.azuredatalakestore.net
Dev File Location: /projs/work/orders.csv
Prod Data Store: adl://storage-adlsgen1-account-prod.azuredatalakestore.net
Prod File Location: /2003/transactions/orders.csv

You can use the following value import rules to change the datastore and file paths.

NOTE: You can specify multiple rules in a single request. Rules are applied in the order in which they are listed. Later rules must account for the results of earlier rules.

host rule: This rule replaces the datastore name with the new one: storage-adlsgen1-account-prod.azuredatalakestore.net.

fileLocation rule: This rule performs a text substitution to replace the file path. This rule applies to both input and output object file paths.

Request:

API Endpoint: /v4/deployments/4/valueImportRules
Method: PATCH
Request Body:


[
  {"type":"host","on":"storage-adlsgen1-account-dev.azuredatalakestore.net","with":"storage-adlsgen1-account-prod.azuredatalakestore.net"},
  {"type":"fileLocation","on":"/projs/work/orders.csv","with":"/2003/transactions/orders.csv"}
]


Response:

Status Code (Success): 200 - OK
Response Body:

When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request.

In the following example, there were no rules, so nothing was deleted:

{
    "deleted": {
        "data": []
    }
}
Documentation: See API Deployments Value Import Rules Patch v4.

To test your rule, perform a dry run of the import. See below.

Example - Remap a relational datasource

When you migrate a relational source from a Dev instance to a Prod instance, you may need to remap your flow to use the production database and table.

NOTE: These rules can be applied to sources or publications of a flow.

In this example, you are replacing the input and output source databases and tables with the corresponding production DB values.

  • Table name 1: dev_trans (Dev) → prod_trans (Prod)
  • Path value 1: dev_db2_src (Dev) → prod_db2_src (Prod)
  • Table name 2: dev_trans_out (Dev) → prod_trans_out (Prod)
  • Path value 2: dev_db2_out (Dev) → prod_db2_out (Prod)


In a single request, you can apply the rules changes to map the above Dev values to the Prod values.

NOTE: You can specify multiple rules in a single request. Rules are applied in the order in which they are listed. Later rules must account for the results of earlier rules.

The on parameter accepts regular expressions. In the following example request, the on parameter has been configured to use a regular expression, under the assumption that all current and future imports will respect the current pattern of database paths and table names.

dbTableName rule: This rule replaces the name of the table to use.

dbPath rule: This rule replaces the path value to the database table. 

NOTE: The content of a dataset or output dbPath is an array. The regular expression for on is applied to every element in the dbPath value. Typically, there's only one element in the dbPath array. In some cases, there may be multiple elements, so be careful when specifying the on value.

Request:

API Endpoint: /v4/deployments/4/valueImportRules
Method: PATCH
Request Body:


[
  {"type":"dbTableName","on":"/dev_([a-zA-Z0-9_]*)/","with":"prod_$1"},
  {"type":"dbPath","on":"/dev_([a-zA-Z0-9_]*)/","with":"prod_$1"}
]
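You can check the patterns locally before submitting them. The following Python sketch approximates the rule semantics against the Dev values in the mapping table above, using a simplified dev_ -> prod_ pattern for both rules, consistent with that table ($1 corresponds to \1 in Python):

```python
import re

tables = ["dev_trans", "dev_trans_out"]
db_path = ["dev_db2_src"]  # dbPath values are arrays; often a single element

# dbTableName rule: prefix swap with a captured suffix ($1 -> \1).
tables = [re.sub(r"dev_([a-zA-Z0-9_]*)", r"prod_\1", t) for t in tables]

# dbPath rule: the on pattern is applied to every element of the
# dbPath array (simplified dev_ -> prod_ pattern).
db_path = [re.sub(r"dev_([a-zA-Z0-9_]*)", r"prod_\1", p) for p in db_path]

print(tables)   # ['prod_trans', 'prod_trans_out']
print(db_path)  # ['prod_db2_src']
```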


Response:

Status Code (Success): 200 - OK
Response Body:

When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request.

In the following example, there were no rules, so nothing was deleted:

{
    "deleted": []
}
Documentation: See API Deployments Value Import Rules Patch v4.

To test your rule, perform a dry run of the import. See below.

Import Dry-Run

After you have specified a set of import rules, you can perform a dry run of an import package. The dry run does not perform the actual import but does report any permissions errors or other issues in the response. 

In this example, the flow2import.zip file contains the package to import into deployment 4.

Request:

API Endpoint: /v4/deployments/4/releases/dryRun
Method: POST
Request Body:

In form data submitted with the request, you must include the following key-value pair:

  • Key: data
  • Value: "@flow2import.zip"


Response:

Status Code (Success): 200 - OK
Response Body:

The response body contains any import remapping rules that have been applied during the import process.

Documentation: See API Releases Create DryRun v4.

After the above dry-run has been executed, the import package can be imported and is automatically connected to the appropriate connection. See API Releases Create v4.
