API Task - Define Deployment Import Mappings
Before you import a packaged flow into a Production environment, you may need to apply import rules to remap objects and locations from the source instance to the new instance. Import mapping rules are not required when importing into the same environment, although they may be helpful in some cases.
Tip
If you are importing a flow that references file-based sources and want to use the original files in the imported flow, it may be easier to grant the importing user permission to access the appropriate source directories and then swap datasets as needed after you complete the import. This method is simpler when applied across a small number of flows.
Note
Import mapping rules apply to deployments in a Production instance under deployment management. You cannot apply import mapping rules between two Dev instances.
Note
Import mapping rules do not apply to any SQL referenced in your flows or datasets. In addition to remapping the connection identifiers, you should store your database names and table names as environment parameters, which apply to the entire workspace. These variables can be exported from one workspace and imported into another, where they can be updated to the correct value for the workspace. For more information, see Overview of Parameterization.
Note
Import mapping rules require the use of the APIs made available from the Designer Cloud Powered by Trifacta platform. API usage is considered a developer-level skill.
For more information on creating an export package, see Export Flow.
For more information on how to import, see Import Flow.
You can apply the following types of remappings:
Type | Description |
---|---|
Value | For value remappings, you can specify rules to match on specific values or patterns of values in the import package and remap those values for use in the new instance. Note In this release, value remapping is supported only for S3 bucket names and paths to imported datasets and output locations. Examples are provided below. |
Object | For object remappings, you can specify rules to match a value listed in the import package and remap that value to a defined object in the new instance. Note In this release, object remapping is supported only for connections. An example is provided below. |
Import Rules
When a flow is imported, references in the flow definition that apply in the source instance may not apply in the target instance. For example, the location paths to the source datasets may need to be rewritten to point to a different location in the target instance.
Before you import your flow definition, you need to define rules for any value or object remapping that must be done in the target environment.
Notes on import rules
Value and object remapping rules should be completed before you import the flow. The flow may be non-functional until the rules are applied.
Tip
After you create your import rules, you can perform a dry run of the import via the API. Any errors are reported in the response. Details are provided below.
Value and object remapping rules are applied at the time of import. If you add new rules, they are not retroactively applied to release packages that have already been imported.
When changing rules:
Any previously applied rules to the same import object are deleted.
You can apply multiple rules in the same change.
Rules are applied in the order in which they are listed in the request. Rules listed later in the request must be compatible with expected changes applied by the earlier rules.
Value and object remapping must be completed via API. API usage is considered a developer-level skill. Examples are provided below.
Note
Import mapping rules do not work for parameterized datasets. If the imported dataset with parameters is still accessible, you should be able to run jobs from it.
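To make the ordering requirement concrete, here is a minimal local sketch (not the platform's actual rule engine) that assumes value rules are plain regular-expression substitutions applied sequentially, in listed order:

```python
import re

def apply_rules(value, rules):
    """Apply value-remapping rules in listed order.
    Local illustration only; assumes the platform applies
    rules sequentially, as documented."""
    for pattern, replacement in rules:
        value = re.sub(pattern, replacement, value)
    return value

rules = [
    (r"/dev/", "/prod/"),            # rule 1 rewrites the environment segment
    (r"/prod/raw/", "/prod/clean/"), # rule 2 must match rule 1's OUTPUT
]
print(apply_rules("hdfs://data/dev/raw/orders.csv", rules))
# hdfs://data/prod/clean/orders.csv
```

Because the second rule matches `/prod/raw/`, it only fires after the first rule has rewritten the environment segment; reversing the order would leave the path untouched by the second rule.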
Import Rule Requirements
If you are importing into the same instance from which you exported (Dev/Test/Prod on the same instance):
Import rules are not required.
If you want to use a different source of data in your Prod flow, you must create import rules.
If you are importing into a different instance from which you exported (Dev and Prod on different instances):
Import rules are required, except in unusual cases.
Import Rule Types
The following types of rules can be applied to import mappings.
Note
Depending on the type of mapping, some of these rules may not apply. Please be sure to review the Examples below.
Object Mapping Types
Type | Description |
---|---|
tableName | Set this value to connections. In this release, connections are the only supported object type for remapping. |
Value Mapping Types
Type | Description |
---|---|
fileLocation | This type is used to remap paths to files. |
s3Bucket | (AWS) Name of the S3 bucket to remap. |
dbTableName | (relational source) Name of the table to remap. |
dbPath | (relational source) Path to the database table. This value is an array. |
host | (Azure) Depending on the Azure datastore, this rule replaces the WASB blob host, the ADLS Gen2 storage account, or the ADLS Gen1 data store. |
userinfo | (Azure) Depending on the Azure datastore, this rule replaces the WASB container or the ADLS Gen2 filesystem. |
Examples
The following are some example import rules to address specific uses.
Example - Replace a connection
In the following example, you must remap the connection from the source instance of the platform to the corresponding connection in the instance where you are importing.
First, you must be able to uniquely identify the connection from the source that you wish to remap.
While the connection Id may work in a limited scope, that identifier is unlikely to be unique across your environments.
If you already know the uuid of the connection from the source system, you can skip the first step below.
From the API response containing the connection definition, you can acquire the uuid
value for the connection, which is a unique identifier for the connection object across all instances of the platform:
Item | v4 APIs |
---|---|
API Endpoint | From the source instance: /v4/connections |
Method | GET |
Request Body | None. |
Response Body | { "data": [ { "connectParams": { "vendor": "redshift", "vendorName": "redshift", "host": "redshift.example.com", "port": "5439", "extraLoadParams": "BLANKSASNULL EMPTYASNULL TRIMBLANKS TRUNCATECOLUMNS", "defaultDatabase": "test" }, "id": 2, "host": "redshift.example.com", "port": 5439, "vendor": "redshift", "params": { "extraLoadParams": "BLANKSASNULL EMPTYASNULL TRIMBLANKS TRUNCATECOLUMNS", "defaultDatabase": "test" }, "ssl": false, "vendorName": "redshift", "name": "redshift", "description": null, "type": "jdbc", "isGlobal": true, "credentialType": "iamRoleArn", "credentialsShared": true, "uuid": "097c2300-2f6a-11e9-a585-57562e0d9cd6", "disableTypeInference": false, "createdAt": "2019-02-13T08:33:28.368Z", "updatedAt": "2019-02-13T08:33:28.381Z", "credentials": [ { "iamRoleArn": "arn:aws:iam:something", "username": "UserName" } ], "creator": { "id": 1 }, "updater": { "id": 1 }, "workspace": { "id": 1 } }, { "connectParams": { "vendor": "hive", "vendorName": "hive", "host": "hadoop", "port": "10000", "jdbc": "hive2", "defaultDatabase": "default" }, "id": 1, "host": "hadoop", "port": 10000, "vendor": "hive", "params": { "jdbc": "hive2", "connectStringOptions": "", "defaultDatabase": "default" }, "ssl": false, "vendorName": "hive", "name": "hive", "description": null, "type": "jdbc", "isGlobal": true, "credentialType": "conf", "credentialsShared": true, "uuid": "08a1a180-2f6a-11e9-b2b2-85d2b0b67f5e", "disableTypeInference": false, "createdAt": "2019-02-13T08:33:26.936Z", "updatedAt": "2019-02-13T08:33:26.952Z", "credentials": [], "creator": { "id": 1 }, "updater": { "id": 1 }, "workspace": { "id": 1 } } ], "count": 2 } |
Documentation | See https://api.trifacta.com/ee/9.7/index.html#operation/getConnection |
In the above, you identify that the connection used for the exported flow is the Redshift one. This object has the following unique identifier:
"uuid": "097c2300-2f6a-11e9-a585-57562e0d9cd6"
In the target system, you must now create a rule in the deployment into which you are importing that searches for this unique value. In the following example:

- The deploymentId is known to be `4`.
- The connectionId for the equivalent Redshift connection in the target system is `1`.

The `uuid` field in the import package is searched for the matching string. If it is found, the connection in the import package is replaced with the connection in the target system with an Id of `1`:
Item | v4 APIs |
---|---|
API Endpoint | /v4/deployments/4/objectImportRules |
Method | POST |
Request Body | [ { "tableName": "connections", "onCondition": { "uuid": "097c2300-2f6a-11e9-a585-57562e0d9cd6" }, "withCondition": { "id": 1 } } ] |
Status Code - Success | 200 - OK |
Response Body | When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request. In the following example, there were no rules, so nothing was deleted: { "deleted": { "data": [] } } |
Documentation | See https://api.trifacta.com/ee/9.7/index.html#operation/updateObjectImportRules |
To test your rule, perform a dry run of the import. See below.
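Creating the rule can be scripted as well. The following sketch builds the request with Python's standard library; the instance URL and token are placeholders, and the POST method and Bearer auth are assumptions to verify against the linked documentation:

```python
import json
import urllib.request

# Hypothetical instance URL and token; substitute your own.
BASE = "https://example.trifacta.local"

rules = [{
    "tableName": "connections",
    "onCondition": {"uuid": "097c2300-2f6a-11e9-a585-57562e0d9cd6"},
    "withCondition": {"id": 1},
}]

req = urllib.request.Request(
    f"{BASE}/v4/deployments/4/objectImportRules",
    data=json.dumps(rules).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer <token>"},
    method="POST",
)
# urllib.request.urlopen(req) would submit the rule; it is not
# called here so the sketch stays self-contained.
print(req.get_method(), req.full_url)
```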
Example - Remap an HDFS location
In this example, your import rule must remap the path to the source from your Dev paths to your Prod paths. Suppose the pattern looks like this:
Dev Path | hdfs://datasets/dev/1/164e0bca-8c91-4e3c-9d0a-2a85eedec817/myData.csv |
---|---|
Prod Path | hdfs://datasets/prod/1/164e0bca-8c91-4e3c-9d0a-2a85eedec817/myData-Prod.csv |
Note the differences:

- The `/dev/` part of the path has been replaced by `/prod/`.
- The filename is different.
You can use the following value import rules to change the path values. In the following example, the rules are applied separately.
Note
You can specify multiple rules in a single request. Rules are applied in the order in which they are listed. Later rules must account for the results of earlier rules.
Request:
Item | v4 APIs |
---|---|
API Endpoint | /v4/deployments/4/valueImportRules |
Method | POST |
Request Body:
[ {"type":"fileLocation","on":"/\/dev\//","with":"/prod/"}, {"type":"fileLocation","on":"/\/([a-zA-Z0-9_]*).csv/","with":"$1-Prod.csv"} ]
Response:
Item | v4 APIs |
---|---|
Status Code - Success | 200 - OK |
Response Body | When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request. In the following example, there were no rules, so nothing was deleted: { "deleted": { "data": [] } } |
Documentation | See https://api.trifacta.com/ee/9.7/index.html#operation/updateValueImportRules |
To test your rule, perform a dry run of the import. See below.
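A quick way to sanity-check the regular expressions before submitting them is to simulate the substitutions locally. This sketch assumes the platform's `/.../` patterns behave like ordinary regular expressions and that `$1` corresponds to Python's `\1`; note that the leading slash consumed by the second pattern is restored in the replacement here:

```python
import re

path = "hdfs://datasets/dev/1/164e0bca-8c91-4e3c-9d0a-2a85eedec817/myData.csv"

# Rule 1: swap the environment segment of the path.
path = re.sub(r"/dev/", "/prod/", path)
# Rule 2: append -Prod to the filename (leading slash restored).
path = re.sub(r"/([a-zA-Z0-9_]*)\.csv", r"/\1-Prod.csv", path)

print(path)
# hdfs://datasets/prod/1/164e0bca-8c91-4e3c-9d0a-2a85eedec817/myData-Prod.csv
```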
Example - Remap an S3 location
For S3 sources, you can apply remapping rules including changing to a new S3 bucket.
In this example, your import rule must remap the path to the source from your Dev paths to your Prod paths. Suppose the pattern looks like this:
Dev S3 Bucket Name | wrangle-dev |
---|---|
Dev Path | /projs/tweets/v04/tweets_month.csv |
Prod S3 Bucket Name | wrangle-prod |
Prod Path | /tweets/tweets_month.csv |
You can use the following value import rules to change the bucket name and path values.
Note
You can specify multiple rules in a single request. Rules are applied in the order in which they are listed. Later rules must account for the results of earlier rules.
s3Bucket name rule: This rule replaces the name of the S3 bucket with the new one: `wrangle-prod`.
fileLocation rule: This rule uses regular expressions to match each segment of the file path within the bucket. It assumes:

- Files are located at a consistent depth in the source bucket.
- Path segments and the filename use only alphanumeric characters and underscores (_).

The replacement path is shortened to contain only the parent directory name ($2) and the filename ($4). This rule applies to both input and output object file paths.
Request:
Item | v4 APIs |
---|---|
API Endpoint | /v4/deployments/4/valueImportRules |
Method | POST |
Request Body:
[ {"type":"s3Bucket","on":"wrangle-dev","with":"wrangle-prod"}, {"type":"fileLocation","on":"/\/([a-zA-Z0-9_]*)\/([a-zA-Z0-9_]*)\/([a-zA-Z0-9_]*)\/([a-zA-Z0-9_]*).csv/","with":"/$2/$4.csv"} ]
Response:
Item | v4 APIs |
---|---|
Status Code - Success | 200 - OK |
Response Body | When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request. In the following example, there were no rules, so nothing was deleted: { "deleted": { "data": [] } } |
Documentation | See https://api.trifacta.com/ee/9.7/index.html#operation/updateValueImportRules |
To test your rule, perform a dry run of the import. See below.
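The same local sanity check works for the S3 rules; this sketch assumes the `on` patterns behave like ordinary regular expressions and that `$2`/`$4` correspond to Python's `\2`/`\4`:

```python
import re

bucket, path = "wrangle-dev", "/projs/tweets/v04/tweets_month.csv"

# s3Bucket rule: literal bucket-name swap.
if bucket == "wrangle-dev":
    bucket = "wrangle-prod"

# fileLocation rule: keep only the parent directory ($2) and
# the filename ($4) from a four-segment path.
seg = r"([a-zA-Z0-9_]*)"
path = re.sub(rf"/{seg}/{seg}/{seg}/{seg}\.csv", r"/\2/\4.csv", path)

print(bucket, path)
# wrangle-prod /tweets/tweets_month.csv
```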
Example - Remap a WASB location
For WASB sources, you can apply remapping rules during import.
In this example, your import rule must remap the blob host, container, and file location:
Dev Blobhost | storage-wasb-account-dev.blob.core.windows.net |
---|---|
Dev Container | container-dev |
Dev File Location | /projs/work/orders.csv |
Prod Blobhost | storage-wasb-account-prod.blob.core.windows.net |
Prod Container | container-prod |
Prod File Location | /2003/transactions/orders.csv |
You can use the following value import rules to change the blobhost, container, and file paths.
Note
You can specify multiple rules in a single request. Rules are applied in the order in which they are listed. Later rules must account for the results of earlier rules.
host rule: This rule replaces the blobhost name with the new one: `storage-wasb-account-prod.blob.core.windows.net`.
userinfo rule: This rule replaces the container name with the new one: `container-prod`.
fileLocation rule: This rule performs a text substitution to replace the file path. This rule applies to both input and output object file paths.
Request:
Item | v4 APIs |
---|---|
API Endpoint | /v4/deployments/4/valueImportRules |
Method | POST |
Request Body:
[ {"type":"host","on":"storage-wasb-account-dev.blob.core.windows.net","with":"storage-wasb-account-prod.blob.core.windows.net"}, {"type":"userinfo","on":"container-dev","with":"container-prod"}, {"type":"fileLocation","on":"/projs/work/orders.csv","with":"/2003/transactions/orders.csv"} ]
Response:
Item | v4 APIs |
---|---|
Status Code - Success | 200 - OK |
Response Body | When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request. In the following example, there were no rules, so nothing was deleted: { "deleted": { "data": [] } } |
Documentation | See https://api.trifacta.com/ee/9.7/index.html#operation/updateValueImportRules |
To test your rule, perform a dry run of the import. See below.
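Because the WASB rules above are literal (non-regex) substitutions, they can be sanity-checked with a simple mapping. This local sketch is an illustration only, not the platform's rule engine:

```python
# Each rule type maps a literal Dev value to its Prod replacement.
rules = {
    "host": ("storage-wasb-account-dev.blob.core.windows.net",
             "storage-wasb-account-prod.blob.core.windows.net"),
    "userinfo": ("container-dev", "container-prod"),
    "fileLocation": ("/projs/work/orders.csv",
                     "/2003/transactions/orders.csv"),
}

source = {
    "host": "storage-wasb-account-dev.blob.core.windows.net",
    "userinfo": "container-dev",
    "fileLocation": "/projs/work/orders.csv",
}

# Replace a value only when it matches the rule's "on" value.
remapped = {k: (rules[k][1] if v == rules[k][0] else v)
            for k, v in source.items()}
print(remapped["host"])
# storage-wasb-account-prod.blob.core.windows.net
```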
Example - Remap an ADLS Gen2 location
For ADLS Gen2 sources, you can apply remapping rules during import.
In this example, your import rule must remap the storage account, filesystem, and file location:
Dev Storage Account | storage-adlsgen2-account-dev.blob.core.windows.net |
---|---|
Dev Filesystem | filesystem-dev |
Dev File Location | /projs/work/orders.csv |
Prod Storage Account | storage-adlsgen2-account-prod.blob.core.windows.net |
Prod Filesystem | filesystem-prod |
Prod File Location | /2003/transactions/orders.csv |
You can use the following value import rules to change the storage account, filesystem, and file paths.
Note
You can specify multiple rules in a single request. Rules are applied in the order in which they are listed. Later rules must account for the results of earlier rules.
host rule: This rule replaces the storage account name with the new one: `storage-adlsgen2-account-prod.blob.core.windows.net`.
userinfo rule: This rule replaces the filesystem name with the new one: `filesystem-prod`.
fileLocation rule: This rule performs a text substitution to replace the file path. This rule applies to both input and output object file paths.
Request:
Item | v4 APIs |
---|---|
API Endpoint | /v4/deployments/4/valueImportRules |
Method | POST |
Request Body:
[ {"type":"host","on":"storage-adlsgen2-account-dev.blob.core.windows.net","with":"storage-adlsgen2-account-prod.blob.core.windows.net"}, {"type":"userinfo","on":"filesystem-dev","with":"filesystem-prod"}, {"type":"fileLocation","on":"/projs/work/orders.csv","with":"/2003/transactions/orders.csv"} ]
Response:
Item | v4 APIs |
---|---|
Status Code - Success | 200 - OK |
Response Body | When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request. In the following example, there were no rules, so nothing was deleted: { "deleted": { "data": [] } } |
Documentation | See https://api.trifacta.com/ee/9.7/index.html#operation/updateValueImportRules |
To test your rule, perform a dry run of the import. See below.
Example - Remap an ADLS Gen1 location
For ADLS Gen1 sources, you can apply remapping rules during import.
In this example, your import rule must remap the Azure data lake store and file location:
Dev data store | adl://storage-adlsgen1-account-dev.azuredatalakestore.net |
---|---|
Dev File Location | /projs/work/orders.csv |
Prod data store | adl://storage-adlsgen1-account-prod.azuredatalakestore.net |
Prod File Location | /2003/transactions/orders.csv |
You can use the following value import rules to change the datastore and file paths.
Note
You can specify multiple rules in a single request. Rules are applied in the order in which they are listed. Later rules must account for the results of earlier rules.
host rule: This rule replaces the datastore name with the new one: `storage-adlsgen1-account-prod.azuredatalakestore.net`.
fileLocation rule: This rule performs a text substitution to replace the file path. This rule applies to both input and output object file paths.
Request:
Item | v4 APIs |
---|---|
API Endpoint | /v4/deployments/4/valueImportRules |
Method | POST |
Request Body:
[ {"type":"host","on":"storage-adlsgen1-account-dev.azuredatalakestore.net","with":"storage-adlsgen1-account-prod.azuredatalakestore.net"}, {"type":"fileLocation","on":"/projs/work/orders.csv","with":"/2003/transactions/orders.csv"} ]
Response:
Item | v4 APIs |
---|---|
Status Code - Success | 200 - OK |
Response Body | When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request. In the following example, there were no rules, so nothing was deleted: { "deleted": { "data": [] } } |
Documentation | See https://api.trifacta.com/ee/9.7/index.html#operation/updateValueImportRules |
To test your rule, perform a dry run of the import. See below.
Example - Remap a relational datasource
When you migrate a relational source from a Dev instance to a Prod instance, you may need to remap your flow to use the production database and table.
Note
These rules can be applied to sources or publications of a flow.
In this example, you are replacing the input and output source databases and tables with the corresponding production DB values.
Item | Dev value | Prod value |
---|---|---|
Table name 1 | dev_trans | prod_trans |
Path value 1 | dev_db2_src | prod_db2_src |
Table name 2 | dev_trans_out | prod_trans_out |
Path value 2 | dev_db2_out | prod_db2_out |
In a single request, you can apply the rules changes to map the above Dev values to the Prod values.
Note
You can specify multiple rules in a single request. Rules are applied in the order in which they are listed. Later rules must account for the results of earlier rules.
The `on` parameter accepts regular expressions. In the following example request, the `on` parameter has been configured to use a regular expression, under the assumption that all current and future imports respect the current pattern of database paths and table names.
dbTableName rule: This rule replaces the name of the table to use.
dbPath rule: This rule replaces the path value to the database table.
Note

The content of a dataset or output `dbPath` is an array. The regular expression for `on` is applied to every element in the `dbPath` value. Typically, there is only one element in the `dbPath` array, but in some cases there may be multiple elements, so be careful when specifying the `on` value.
Request:
Item | v4 APIs |
---|---|
API Endpoint | /v4/deployments/4/valueImportRules |
Method | POST |
Request Body:
[ {"type":"dbTableName","on":"/dev_([a-zA-Z0-9_]*)/","with":"prod_$1"}, {"type":"dbPath","on":"/dev_([a-zA-Z0-9_]*)_src/","with":"prod_$1_out"} ]
Response:
Item | v4 APIs |
---|---|
Status Code - Success | 200 - OK |
Response Body | When the new rules are applied, all previously existing rules for the object in the deployment are deleted. The response body contains any rules that have been deleted as part of this request. In the following example, there were no rules, so nothing was deleted: { "deleted": { "data": [] } } |
Documentation | See https://api.trifacta.com/ee/9.7/index.html#operation/updateValueImportRules |
To test your rule, perform a dry run of the import. See below.
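To sanity-check the relational rules locally, the following sketch applies a generalized `dev_` to `prod_` substitution to the table names and to each element of the `dbPath` array, mirroring the intended Dev-to-Prod mapping in the table above (the assumption is that all names follow this naming pattern):

```python
import re

# Dev-side values from the table above.
table_names = ["dev_trans", "dev_trans_out"]
db_paths = ["dev_db2_src", "dev_db2_out"]  # dbPath is an array

def remap(value):
    """Generalized dev_* -> prod_* substitution."""
    return re.sub(r"dev_([a-zA-Z0-9_]*)", r"prod_\1", value)

prod_tables = [remap(t) for t in table_names]
prod_paths = [remap(p) for p in db_paths]   # applied per element

print(prod_tables)  # ['prod_trans', 'prod_trans_out']
print(prod_paths)   # ['prod_db2_src', 'prod_db2_out']
```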
Import Dry-Run
After you have specified a set of import rules, you can perform a dry run of an import package. The dry run does not perform the actual import, but it reports any permissions errors or other issues in the response.
In this example, the `flow2import.zip` file contains the package to import into deployment `4`.
Request:
Item | v4 APIs |
---|---|
API Endpoint | /v4/deployments/4/releases/dryRun |
Method | POST |
Request Body | In form data submitted with the request, you must include a key-value pair that supplies the import package file. |
Response:
Item | v4 APIs |
---|---|
Status Code - Success | 200 - OK |
Response Body | The response body contains any import remapping rules that have been applied during the import process. |
Documentation | See https://api.trifacta.com/ee/9.7/index.html#operation/importPackageForDeploymentDryRun |
After the above dry-run has been executed, the import package can be imported and is automatically connected to the appropriate connection. See https://api.trifacta.com/ee/9.7/index.html#operation/importPackageForDeployment
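The dry run can be scripted with the standard library alone. In this sketch, the instance URL, the token, and the form field name (`data`) are assumptions; verify them against the linked documentation before use:

```python
import urllib.request
import uuid

# Hypothetical instance URL; substitute your own.
BASE = "https://example.trifacta.local"
boundary = uuid.uuid4().hex
package = b"<zip bytes of flow2import.zip>"  # placeholder payload

# Build a multipart/form-data body carrying the package under
# the assumed field name "data".
body = (
    (
        f"--{boundary}\r\n"
        'Content-Disposition: form-data; name="data"; '
        'filename="flow2import.zip"\r\n'
        "Content-Type: application/zip\r\n\r\n"
    ).encode("utf-8")
    + package
    + f"\r\n--{boundary}--\r\n".encode("utf-8")
)

req = urllib.request.Request(
    f"{BASE}/v4/deployments/4/releases/dryRun",
    data=body,
    headers={
        "Content-Type": f"multipart/form-data; boundary={boundary}",
        "Authorization": "Bearer <token>",
    },
    method="POST",
)
# urllib.request.urlopen(req) would run the dry run; the JSON
# response reports any remapping rules applied and any errors.
print(req.get_method(), req.full_url)
```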