Cache management in GraphQL - Time & Space Complexity
When using a cache in GraphQL, we want to know how the time to fetch data changes as the amount of stored data grows.
We ask: how does fetching from the cache affect speed as more data is stored?
Analyze the time complexity of the following GraphQL query, which looks up a user through a cache directive.
```graphql
query GetUserData($id: ID!) {
  user(id: $id) @cache(key: $id) {
    id
    name
    posts {
      id
      title
    }
  }
}
```
This query fetches a user by ID using a cache key, then retrieves their posts.
Look for repeated actions that affect time.
- Primary operation: Searching the cache for the user data by key.
- How many times: the cache lookup happens once per query, but returning the posts iterates over every post the user has.
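Server-side, the `@cache` directive might be backed by a simple in-memory store. Here is a minimal Python sketch of that idea; `cache`, `resolve_user`, and `fetch_from_db` are hypothetical names for illustration, not a real GraphQL library API:

```python
# Hypothetical in-memory cache for the GetUserData query.
cache = {}  # user id -> user record; a dict gives average O(1) lookup by key

def resolve_user(user_id, fetch_from_db):
    """Return the user for `user_id`, consulting the cache first."""
    if user_id in cache:           # one hash lookup, independent of cache size
        return cache[user_id]
    user = fetch_from_db(user_id)  # cache miss: pay the full fetch once
    cache[user_id] = user
    return user

def select_posts(user):
    """The posts selection from the query: O(p) in the number of posts."""
    return [{"id": p["id"], "title": p["title"]} for p in user["posts"]]
```

A second call with the same id returns the cached record without calling `fetch_from_db` again; the posts loop still costs O(p) either way.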
As the number of cached users grows, finding one user depends on the cache structure.
| Cached users (n) | Worst-case lookups (list-based cache) |
|---|---|
| 10 | ~10 |
| 100 | ~100 |
| 1000 | ~1000 |
Pattern observation: if the cache is a list, lookup time grows linearly with the number of users. If the cache uses a fast keyed lookup (like a hash map), lookup time stays roughly constant.
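The pattern in the table can be reproduced with a short sketch. The list-of-pairs cache and the helper name `worst_case_lookups` are illustrative, assuming the worst case where the target user was cached last:

```python
def worst_case_lookups(n):
    """Scan a list-backed cache of (key, user) pairs for the last key,
    counting how many key comparisons the scan makes."""
    cache = [(f"user{i}", {"id": f"user{i}"}) for i in range(n)]
    target = f"user{n - 1}"  # stored last: worst case for a linear scan
    for steps, (key, _) in enumerate(cache, start=1):
        if key == target:
            return steps

for n in (10, 100, 1000):
    print(n, worst_case_lookups(n))  # comparisons grow with n: 10, 100, 1000
```

The comparison count tracks n exactly, which is the linear growth the table describes.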
Time Complexity: O(1) for the cached user lookup, assuming a map-based cache; returning the user's p posts adds O(p).
This means fetching a cached user by key takes about the same time no matter how many users are cached, as long as the cache is keyed like a hash map.
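To see why a hash-map cache does not scan its entries, we can count how many key comparisons Python's `dict` actually performs during one lookup. `CountingKey` is an illustrative helper that instruments `__eq__`:

```python
class CountingKey:
    """A dict key that counts how often it is compared for equality."""
    eq_calls = 0

    def __init__(self, s):
        self.s = s

    def __hash__(self):
        return hash(self.s)

    def __eq__(self, other):
        CountingKey.eq_calls += 1
        return self.s == getattr(other, "s", other)

# Build a cache of 1000 users, then look one up.
cache = {CountingKey(f"user{i}"): i for i in range(1000)}
CountingKey.eq_calls = 0
_ = cache[CountingKey("user999")]
print(CountingKey.eq_calls)  # typically 1: no scan over the other 999 entries
```

The dict jumps to the right bucket via the hash, so the lookup compares roughly one key regardless of how many users are cached.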
[X] Wrong: "Cache always makes queries faster regardless of how it is built."
[OK] Correct: If the cache is a simple list, searching can still take longer as data grows. The cache structure matters.
Understanding how cache lookup time changes with data size shows you know how to keep apps fast and scalable.
What if the cache was stored as a list instead of a map? How would the time complexity change?