These topics cover how to operationalize your flows for use in your enterprise data pipelines.
Operationalization includes all tasks that occur after primary flow development has completed. After your flow is working and delivering useful data, operationalization can include:
- Scheduling. For more information, see Schedule a Job.
- SQL scripting. Before or after job execution, you can configure your job to execute a specified SQL script. For more information, see Create Output SQL Scripts.
- Macros. You can build reusable sets of recipe steps, which can be used in other flows and workspaces for consistency. See Create or Replace Macro.
- Flow Webhooks. After a flow has executed, you can configure a webhook task to deliver messages about the flow execution to other systems. See below.
- Plans. A plan is a sequence of tasks that can be executed based on logic and can be scheduled. See below.
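As a concrete illustration of the webhook task described above, the sketch below shows how a receiving system might handle a flow-execution message. This is a minimal, hypothetical example: the payload field names (`flowName`, `jobStatus`, `jobId`) and status values are assumptions for illustration, not the platform's actual webhook schema, so consult the product documentation for the real message format.

```python
import json

def handle_flow_webhook(raw_body: str) -> str:
    """Parse a flow webhook message and decide how to route it.

    The field names used here (flowName, jobStatus, jobId) are
    illustrative assumptions, not the platform's documented schema.
    """
    payload = json.loads(raw_body)
    status = payload.get("jobStatus", "unknown")
    flow = payload.get("flowName", "unknown flow")
    if status == "Complete":
        # e.g. post a success message to a chat channel
        return f"notify: {flow} finished successfully"
    elif status == "Failed":
        # e.g. page an on-call engineer with the job identifier
        return f"alert: {flow} failed (job {payload.get('jobId')})"
    # anything else (pending, canceled, unknown) is logged and dropped
    return f"ignore: {flow} reported status {status}"

# Example: a success message for a hypothetical flow
print(handle_flow_webhook('{"flowName": "sales_clean", "jobStatus": "Complete"}'))
```

In practice this logic would live behind an HTTP endpoint that the webhook task is configured to call; the routing shown here (notify on success, alert on failure) is the typical reason to wire flow executions into other systems.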
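To make the "plan" concept above more concrete, the following sketch models a plan as an ordered sequence of tasks where later tasks run or are skipped based on earlier results. The task names and the abort-on-failure rule are illustrative assumptions; a real plan is configured in the product, not written in code.

```python
def run_plan(tasks):
    """Run (name, callable) pairs in order.

    Stops at the first failure, modeling a plan's conditional logic:
    downstream tasks only run if upstream tasks succeed.
    """
    results = []
    for name, task in tasks:
        try:
            task()
            results.append((name, "succeeded"))
        except Exception as exc:
            results.append((name, f"failed: {exc}"))
            break  # skip remaining tasks once one fails
    return results

# Hypothetical two-step plan: export a report only if the flow ran cleanly.
plan = [
    ("run_flow", lambda: None),       # stand-in for executing a flow
    ("export_report", lambda: None),  # runs only if run_flow succeeded
]
print(run_plan(plan))
```

Scheduling a plan then amounts to triggering this whole sequence on a timer, which is what distinguishes a plan from running individual jobs by hand.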