Why serverless matters in Azure - Performance Analysis
We want to understand how the running time of a serverless function changes as we add more tasks.
How does the number of function calls grow when the workload grows?
Analyze the time complexity of the following operation sequence.
// Azure Function triggered by HTTP request
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, ILogger log)
{
    // Read the item count from the query string (e.g. ?n=100); default to 0 if missing
    int n = int.TryParse(
        req.GetQueryNameValuePairs().FirstOrDefault(q => q.Key == "n").Value,
        out var parsed) ? parsed : 0;

    // Start one task per item so the items are processed in parallel
    var tasks = new List<Task>();
    for (int i = 0; i < n; i++)
    {
        tasks.Add(ProcessItemAsync(i));
    }
    await Task.WhenAll(tasks);
    return req.CreateResponse(HttpStatusCode.OK);
}

private static async Task ProcessItemAsync(int itemId)
{
    // Simulate 100 ms of processing per item
    await Task.Delay(100);
}
This code runs a serverless function that processes n items in parallel by calling a helper function for each item.
Identify the operations that repeat: API calls, resource provisioning steps, and data transfers.
- Primary operation: Calling ProcessItemAsync for each item.
- How many times: Exactly n times, once per item.
Each new item adds one more function call to process it, so the total calls grow directly with n.
| Input Size (n) | Approx. API Calls/Operations |
|---|---|
| 10 | 10 calls to ProcessItemAsync |
| 100 | 100 calls to ProcessItemAsync |
| 1000 | 1000 calls to ProcessItemAsync |
Pattern observation: The number of calls grows linearly as the input size increases.
Time Complexity: O(n)
This means the total work (the number of ProcessItemAsync calls) grows in direct proportion to the number of items to process.
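As a quick sanity check, we can count the calls directly. The sketch below is illustrative, not the deployed function: `CallCounter` and its stand-in `ProcessItemAsync` (which only records the invocation, with a shortened delay) are assumptions for the demo, but the loop and `Task.WhenAll` mirror the structure above.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

public static class CallCounter
{
    private static int _calls;

    // Stand-in for ProcessItemAsync that just records each invocation
    private static async Task ProcessItemAsync(int itemId)
    {
        Interlocked.Increment(ref _calls);
        await Task.Delay(10); // shortened simulated work
    }

    // Same fan-out structure as the Azure Function: one task per item
    public static async Task<int> RunAsync(int n)
    {
        _calls = 0;
        var tasks = new List<Task>();
        for (int i = 0; i < n; i++)
        {
            tasks.Add(ProcessItemAsync(i));
        }
        await Task.WhenAll(tasks);
        return _calls; // exactly n calls: linear growth
    }
}
```

Running this with n = 10, 100, and 1000 reproduces the table above: the call count is always exactly n.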
[X] Wrong: "Serverless functions run instantly, so time does not grow with more items."
[OK] Correct: Each item still needs its own function call, so more items mean more calls and more total work, even when the calls overlap in wall-clock time.
Understanding how serverless scales with workload helps you design efficient cloud solutions and explain your reasoning clearly in interviews.
"What if we changed the code to process items one after another instead of in parallel? How would the time complexity change?"
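One way to explore that question is the sketch below (a hedged illustration, not part of the original function; `SequentialDemo` and the 100 ms delay are assumptions carried over from the sample). Awaiting each item inside the loop serializes the work, so the wall-clock time becomes roughly n × 100 ms, whereas the `Task.WhenAll` version overlaps the delays; the number of calls stays O(n) in both cases.

```csharp
using System;
using System.Diagnostics;
using System.Threading.Tasks;

public static class SequentialDemo
{
    private static async Task ProcessItemAsync(int itemId)
    {
        await Task.Delay(100); // simulate 100 ms of work per item
    }

    // Awaiting inside the loop serializes the items:
    // total wall-clock time is roughly n * 100 ms
    public static async Task<long> RunSequentialAsync(int n)
    {
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < n; i++)
        {
            await ProcessItemAsync(i);
        }
        sw.Stop();
        return sw.ElapsedMilliseconds;
    }
}
```

With n = 5 this takes roughly 500 ms, versus roughly 100 ms when the same five tasks run under `Task.WhenAll`: sequential awaiting makes the wall-clock time O(n) as well, not just the call count.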