Expiration-based caching in a REST API - Time Complexity
When using expiration-based caching in a REST API, we want to know how the time to handle requests changes as more data is cached or expires.
We ask: How does the caching process affect the speed of serving requests as cache size grows?
Analyze the time complexity of the following code snippet.
```javascript
const cache = {};

function getData(key) {
  if (key in cache) {
    if (cache[key].expiry > currentTime()) {
      return cache[key].value;   // fresh hit: serve from cache
    }
    delete cache[key];           // expired: evict the stale entry
  }
  const data = fetchFromDatabase(key);                       // cache miss
  cache[key] = { value: data, expiry: currentTime() + 300 }; // 300-second TTL
  return data;
}
```
This code checks if data is in cache and not expired. If expired or missing, it fetches fresh data and updates the cache.
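To see this behavior end to end, here is a runnable sketch of the snippet above. The stubbed `fetchFromDatabase`, the `dbFetches` counter, and the injectable clock are illustrative assumptions added so hits, misses, and expiry can be observed directly.

```javascript
const TTL_SECONDS = 300;

let now = 0;                       // simulated clock, in seconds
const currentTime = () => now;

let dbFetches = 0;                 // counts how often we miss the cache
function fetchFromDatabase(key) {  // stub standing in for a real database
  dbFetches += 1;
  return `value-for-${key}`;
}

const cache = {};

function getData(key) {
  if (key in cache) {
    if (cache[key].expiry > currentTime()) {
      return cache[key].value;     // fresh hit: no database call
    }
    delete cache[key];             // stale entry: evict, then refetch
  }
  const data = fetchFromDatabase(key);
  cache[key] = { value: data, expiry: currentTime() + TTL_SECONDS };
  return data;
}

getData("user:1");                 // miss: hits the database
getData("user:1");                 // hit: served from cache
now = 301;                         // advance the clock past the TTL
getData("user:1");                 // expired: hits the database again
console.log(dbFetches);            // 2
```

Injecting the clock rather than calling `Date.now()` directly makes expiry deterministic to test, which is a common design choice for cache code.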
Identify any loops, recursion, or array traversals that would repeat work on each request.
- Primary operation: Checking if the key exists and if the cached item is expired.
- How many times: This check happens once per request.
As the number of cached items grows, each request still checks only one key.
| Input Size (number of cached keys) | Approx. Operations per Request |
|---|---|
| 10 | 1 key check + possible fetch |
| 100 | 1 key check + possible fetch |
| 1000 | 1 key check + possible fetch |
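The pattern in the table can be checked with a small instrumented sketch: no matter how many keys the cache holds, serving one request inspects exactly one entry. The helper name and counter below are illustrative, not part of the original snippet.

```javascript
// Count how many cache entries a single request inspects,
// for caches of different sizes.
function checksPerRequest(cacheSize) {
  const cache = {};
  const expiry = Number.MAX_SAFE_INTEGER;    // never expires in this demo
  for (let i = 0; i < cacheSize; i++) {
    cache[`key-${i}`] = { value: i, expiry };
  }

  let inspected = 0;
  const key = "key-0";
  if (key in cache) {                        // one membership test...
    inspected += 1;                          // ...touches one entry,
    if (cache[key].expiry > Date.now()) {
      // cache hit: expiry checked on this one entry only
    }
  }
  return inspected;
}

console.log(checksPerRequest(10));     // 1
console.log(checksPerRequest(100));    // 1
console.log(checksPerRequest(1000));   // 1
```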
Pattern observation: The time to check the cache does not grow with cache size, because JavaScript objects use hash-based key lookup, which is O(1) on average.
Time Complexity: O(1) per request for the cache check. (A cache miss adds the cost of fetchFromDatabase, but that cost does not depend on how many items are cached.)
This means each request takes about the same time to check the cache, no matter how many items are stored.
[X] Wrong: "Checking if a cached item expired takes longer as the cache grows."
[OK] Correct: The cache uses direct key lookup, so the expiration check touches only the requested item, not every cached item.
Understanding how caching affects request time helps you design fast APIs and shows you can think about performance clearly.
"What if the cache used a list instead of a key-value store? How would the time complexity change?"
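One way to explore that question: a hypothetical list-backed cache, sketched below, where each lookup scans entries one by one. The scan visits up to n entries, so the check becomes O(n) instead of O(1). All names here are illustrative.

```javascript
// Hypothetical list-backed cache: lookup scans every entry
// until it finds a matching, unexpired key.
const listCache = [];   // entries: { key, value, expiry }

function getFromList(key, nowSeconds) {
  let scanned = 0;
  for (const entry of listCache) {         // linear scan: O(n)
    scanned += 1;
    if (entry.key === key && entry.expiry > nowSeconds) {
      return { value: entry.value, scanned };
    }
  }
  return { value: null, scanned };         // miss after scanning everything
}

// Fill the cache with 1000 never-expiring entries.
for (let i = 0; i < 1000; i++) {
  listCache.push({ key: `key-${i}`, value: i, expiry: Infinity });
}

// The last key requires scanning all 1000 entries.
console.log(getFromList("key-999", 0).scanned);   // 1000
// A missing key also scans all 1000 entries before giving up.
console.log(getFromList("nope", 0).scanned);      // 1000
```

With a list, the operations-per-request column in the table above would read roughly 10, 100, and 1000 instead of a constant 1, which is why key-value stores are the usual choice for caches.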