Durable Functions for workflows in Azure - Time & Space Complexity
When using Durable Functions for workflows, it's important to understand how the number of steps affects the time and resources the process needs. Specifically, we want to know how the workflow's execution cost grows as we add more tasks.
Analyze the time complexity of this Durable Function orchestration.
```csharp
[FunctionName("Orchestrator")]
public static async Task<string[]> RunOrchestrator(
    [OrchestrationTrigger] IDurableOrchestrationContext context)
{
    var tasks = new List<Task<string>>();
    var inputs = context.GetInput<List<string>>();

    // Fan-out: schedule one activity call per input item.
    foreach (var input in inputs)
    {
        tasks.Add(context.CallActivityAsync<string>("ActivityFunction", input));
    }

    // Fan-in: wait for all activity calls to complete.
    var results = await Task.WhenAll(tasks);
    return results;
}
```
This orchestration calls an activity function for each input item in parallel and waits for all to complete.
Look at what repeats as input grows:
- Primary operation: Calling the activity function for each input item.
- How many times: Once per input item, so as many times as the number of inputs.
Each new input adds one activity call that runs in parallel.
| Input Size (n) | Activity Calls |
|---|---|
| 10 | 10 activity calls |
| 100 | 100 activity calls |
| 1000 | 1000 activity calls |
Pattern observation: The number of calls grows directly with the number of inputs.
Time Complexity: O(n)
This means the total number of activity calls grows linearly as you add more inputs.
[X] Wrong: "Because activities run in parallel, adding more inputs won't increase total execution time."
[OK] Correct: Although activities run concurrently, the orchestrator still schedules one call per input, so the total work (activity calls, compute, and orchestration overhead) grows linearly with the number of inputs; wall-clock time stays roughly flat only as long as the platform can scale out to run them.
Understanding how workflows scale with input size lets you design cloud processes that handle growth smoothly and estimate resource needs accurately.
"What if the orchestration called activities one after another instead of in parallel? How would the time complexity change?"