Long-running operations (async responses) in REST APIs - Time & Space Complexity
When dealing with long-running operations in APIs, it's important to understand how the time to complete grows as the work increases.
We want to know how waiting for async responses affects the total time as input size changes.
Analyze the time complexity of the following code snippet.
```javascript
async function fetchData(ids) {
  const results = [];
  for (const id of ids) {
    const response = await fetch(`/api/data/${id}`);
    const data = await response.json();
    results.push(data);
  }
  return results;
}
```
This code fetches data for each id one after another, waiting for each response before starting the next.
Identify the repeated work: loops, recursion, or array traversals.
- Primary operation: Fetching data from the API for each id.
- How many times: Once per id in the input list, sequentially.
Each id causes a separate fetch call that waits for a response before moving on.
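You can see this sequential waiting directly by timing the loop. The sketch below replaces the real network call with a hypothetical `mockFetch` that resolves after a fixed delay (the 50 ms latency and the function name are illustrative assumptions, not part of the original code):

```javascript
// mockFetch stands in for a real network request: it resolves
// with a small payload after a fixed simulated latency.
const mockFetch = (id) =>
  new Promise((resolve) => setTimeout(() => resolve({ id }), 50));

async function fetchSequentially(ids) {
  const results = [];
  const start = Date.now();
  for (const id of ids) {
    // Each await pauses the loop until this request resolves,
    // so total time is roughly ids.length * latency.
    results.push(await mockFetch(id));
  }
  console.log(`Fetched ${results.length} items in ~${Date.now() - start} ms`);
  return results;
}

fetchSequentially([1, 2, 3, 4]);
```

Doubling the number of ids roughly doubles the elapsed time, which is the linear pattern the table below makes explicit.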
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 fetch calls, done one after another |
| 100 | 100 fetch calls, each waiting for the previous |
| 1000 | 1000 fetch calls, total time grows with each call |
Pattern observation: Total time grows roughly in direct proportion to the number of ids because calls happen one after another.
Time Complexity: O(n)
This means the total time grows linearly with the number of requests because each waits for the previous to finish.
[X] Wrong: "Since the calls are async, all requests happen at the same time, so time stays constant regardless of input size."
[OK] Correct: In this code, each fetch waits for the previous one to finish before starting, so calls happen one after another, not in parallel.
Understanding how async calls affect total time helps you design APIs and clients that handle many requests efficiently and shows you can reason about real-world performance.
What if we changed the code to start all fetch calls at once and then wait for all to finish? How would the time complexity change?
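One way to sketch that change, reusing a hypothetical `mockFetch` helper to stand in for the real request:

```javascript
// mockFetch simulates a network request with fixed latency.
const mockFetch = (id) =>
  new Promise((resolve) => setTimeout(() => resolve({ id }), 50));

async function fetchAllAtOnce(ids) {
  // map() starts every request immediately, without awaiting any of them...
  const requests = ids.map((id) => mockFetch(id));
  // ...and Promise.all resolves once the slowest request finishes,
  // preserving the original order of ids in the result array.
  return Promise.all(requests);
}

fetchAllAtOnce([1, 2, 3]);
```

With all requests in flight at once, the wall-clock waiting time is bounded by the slowest single response rather than the sum of all of them, so it no longer grows with n (the client still does O(n) work issuing requests and collecting results, and in practice servers and browsers cap how many requests can actually run concurrently).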