DataLoader batching and caching in GraphQL - Time & Space Complexity
When you use DataLoader in GraphQL, it coalesces many individual key requests into a single batched call and caches results so repeated keys are not fetched twice.
The question we want to answer: how does the time to fetch data grow as the number of requested keys grows?
Analyze the time complexity of the following code snippet.
```javascript
const loader = new DataLoader(keys => batchLoadFunction(keys));

// Later in resolvers
const results = await Promise.all(
  keys.map(key => loader.load(key))
);
```
This code batches multiple key requests into a single batchLoadFunction call and caches each result by key for reuse.
To analyze it, look for the repeated operations that dominate the running time.
- Primary operation: batchLoadFunction called once per batch with all keys.
- How many times: Once per batch, not once per key.
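The coalescing behavior described above can be sketched without the real library. This is a simplified stand-in for DataLoader (not its actual implementation): `load` calls made in the same tick are queued, and a single microtask dispatches them all in one `batchLoadFunction` call.

```javascript
// Minimal sketch of DataLoader-style batching (illustration only, not the real library).
// Loads requested in the same tick are coalesced into one batch call via a microtask.
function createLoader(batchLoadFunction) {
  let queue = [];            // pending { key, resolve } entries for the current batch
  const cache = new Map();   // per-key promise cache

  function dispatch() {
    const batch = queue;
    queue = [];
    const keys = batch.map(item => item.key);
    // One call for the whole batch — this single call does the O(n) work.
    batchLoadFunction(keys).then(values =>
      batch.forEach((item, i) => item.resolve(values[i]))
    );
  }

  return {
    load(key) {
      if (cache.has(key)) return cache.get(key);          // cached: no new work
      const promise = new Promise(resolve => {
        if (queue.length === 0) queueMicrotask(dispatch); // one dispatch per batch
        queue.push({ key, resolve });
      });
      cache.set(key, promise);
      return promise;
    },
  };
}

// Example: a batch function that resolves each key to key * 2,
// counting how many times it is actually invoked.
let batchCalls = 0;
const loader = createLoader(async keys => {
  batchCalls += 1;
  return keys.map(k => k * 2);
});

Promise.all([1, 2, 3].map(k => loader.load(k))).then(results => {
  console.log(results);     // three loads resolve from one batch
  console.log(batchCalls);  // the batch function ran once
});
```

Three separate `load` calls produce exactly one `batchLoadFunction` invocation, which is the behavior the bullet points describe.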
As more keys come in, DataLoader groups them to reduce calls.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 1 batch call with 10 keys |
| 100 | 1 batch call with 100 keys |
| 1000 | 1 batch call with 1000 keys |
Pattern observation: The number of batch calls stays the same (one), but each call handles more keys.
Time Complexity: O(n)
Time grows linearly with the number of keys: DataLoader reduces the work to a single batch call, but that call must still do work for every key it receives.
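To make the linear growth concrete, here is a small sketch in which the batch function's body is a placeholder doing one unit of work per key (the counting is illustrative; a real batch function would query a data source):

```javascript
// One batch call, but the work inside still touches every key,
// so total operations scale linearly with the number of keys: O(n).
function batchLoadFunction(keys) {
  let operations = 0;
  const results = keys.map(key => {
    operations += 1;        // one unit of work per key (placeholder)
    return { id: key };     // stand-in for a fetched record
  });
  console.log(`${keys.length} keys -> 1 batch call, ${operations} operations`);
  return Promise.resolve(results);
}

batchLoadFunction(Array.from({ length: 10 }, (_, i) => i));
// 10 keys -> 1 batch call, 10 operations
batchLoadFunction(Array.from({ length: 1000 }, (_, i) => i));
// 1000 keys -> 1 batch call, 1000 operations
```

The call count stays at one while the operation count tracks n, matching the table above.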
[X] Wrong: "DataLoader makes each key load instantly, so time does not grow with more keys."
[OK] Correct: Even though DataLoader batches keys, the batchLoadFunction still processes all keys together, so time grows with the number of keys.
Understanding how batching and caching affect time helps you explain efficient data fetching in GraphQL APIs clearly and confidently.
What if the batchLoadFunction itself made multiple database calls instead of one? How would the time complexity change?
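One way to explore that question is with a sketch. Suppose each batch issues a hypothetical `dbQuery` per key instead of one query for the whole batch; the number of database round trips then grows from 1 to n per batch (both variants below are illustrative, not real database APIs):

```javascript
// Hypothetical database call counter (illustration only).
let dbCalls = 0;
const dbQuery = async keys => {
  dbCalls += 1;                          // one round trip per invocation
  return keys.map(key => ({ id: key })); // stand-in for fetched rows
};

// Variant A: one query for the whole batch -> 1 call, O(n) work inside it.
const batchLoadOnce = keys => dbQuery(keys);

// Variant B: one query per key -> n calls; still O(n) total work,
// but every call pays round-trip latency, so wall-clock time is far worse.
const batchLoadPerKey = keys => Promise.all(keys.map(key => dbQuery([key])));

batchLoadOnce([1, 2, 3])
  .then(() => console.log(dbCalls))      // one round trip for the whole batch
  .then(() => batchLoadPerKey([1, 2, 3]))
  .then(() => console.log(dbCalls));     // three more round trips, one per key
```

Asymptotically both variants are O(n), but the per-key variant moves the n from cheap in-memory work to expensive network round trips, which is exactly the problem batching exists to avoid.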