In many scenarios, data pipelines have dependencies: data sourced from one dataset must be cleaned and delivered before it can be used in another pipeline. Once a pipeline is built, it needs to be scheduled for periodic execution, and when each execution completes, downstream stakeholders should be notified whether the pipeline succeeded or failed.
In Designer Cloud, plans provide the mechanism for building these robust data pipelines within the application: sequences of flows executed in order to deliver datasets for downstream use. Based on the success or failure of these flow executions, Designer Cloud can deliver messages to receiving applications, such as Slack.
Tip: As one of your plan tasks, you can configure an HTTP message, which can trigger endpoints in the Designer Cloud application itself. This means a plan execution can trigger a wide range of activities within the product.
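Conceptually, an HTTP task is just a target URL, a method, headers, and a body. The sketch below shows that general shape using Python's standard library; the endpoint URL and payload here are hypothetical placeholders, not actual Designer Cloud API paths (consult the product's API documentation for real endpoints).

```python
import json
import urllib.request

def build_http_task(url, method="POST", body=None):
    """Assemble the pieces of an HTTP-message task: URL, method, headers, body."""
    return {
        "url": url,
        "method": method,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body or {}),
    }

def run_http_task(task):
    """Fire the configured request (performs a network call; not run here)."""
    req = urllib.request.Request(
        task["url"],
        data=task["body"].encode("utf-8"),
        headers=task["headers"],
        method=task["method"],
    )
    return urllib.request.urlopen(req)

# Hypothetical endpoint and payload, for illustration only:
task = build_http_task("https://example.com/api/run", body={"planId": 42})
```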
In this example, you want to create a plan that executes the following tasks:
- Execute the flow task that performs the initial cleaning of your data. The resulting data is saved to a known location.
- Execute the flow task that transforms the cleaned data for downstream uses.
- Based on the outcome of step 2:
  - On success: Send a success message to a Slack channel.
  - On failure: Send a failure message to a Slack channel.
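The Slack notifications in the last step are typically delivered to a Slack incoming webhook, which accepts a JSON body with a `text` field. A minimal sketch of building and sending such a message is shown below; the webhook URL is a placeholder you would obtain from your own Slack workspace.

```python
import json
import urllib.request

def build_slack_payload(plan_name: str, succeeded: bool) -> dict:
    """Build the JSON body for a Slack incoming-webhook message."""
    status = "succeeded" if succeeded else "failed"
    return {"text": f"Plan '{plan_name}' {status}."}

def notify_slack(webhook_url: str, payload: dict) -> None:
    """POST the payload to the Slack incoming webhook (network call; not run here)."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

# Example usage with a placeholder webhook URL:
# notify_slack("https://hooks.slack.com/services/T000/B000/XXXX",
#              build_slack_payload("Clean-and-Transform", succeeded=True))
```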
In Plan View, this plan might look like the following:
After creating the plan above, you can perform test runs. If the tests succeed, you can schedule the plan for periodic execution.
Create a Plan
You build plans in Plan View. See Build a Plan.
Schedule a Plan
You can schedule executions of your plan. See Schedule a Plan Run.