Validation-based caching in REST APIs - Time & Space Complexity
When using validation-based caching in REST APIs, we want to know how the time to check and serve cached data changes as the number of requests grows.
We ask: how does cache validation affect response time when many requests come in?
Analyze the time complexity of the following code snippet.
```
// Pseudocode for validation-based caching
GET /resource
if cache exists for resource:
    if cache is still valid:
        return cached response
    else:
        fetch fresh data
        update cache
        return fresh response
else:
    fetch fresh data
    store in cache
    return fresh response
```
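The pseudocode above can be sketched in Python. This is a minimal illustration, not a production cache: the in-memory `cache` dict, the `MAX_AGE_SECONDS` freshness window, and the `fetch_fresh` placeholder are all assumptions made for the example.

```python
import time

# Hypothetical in-memory cache: resource path -> (data, stored_at)
cache = {}
MAX_AGE_SECONDS = 60  # assumed freshness window for validation

def fetch_fresh(resource):
    # Placeholder for a real backend or database fetch.
    return f"fresh data for {resource}"

def get_resource(resource):
    entry = cache.get(resource)
    if entry is not None:
        data, stored_at = entry
        # Validation step: is the cached copy still fresh?
        if time.time() - stored_at < MAX_AGE_SECONDS:
            return data  # cache hit: return cached response
        # Stale: fetch fresh data and update the cache.
        data = fetch_fresh(resource)
        cache[resource] = (data, time.time())
        return data
    # No cache entry: fetch fresh data and store it.
    data = fetch_fresh(resource)
    cache[resource] = (data, time.time())
    return data
```

A first call populates the cache; a second call within the freshness window is served from it without re-fetching.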
This code checks if cached data is valid before deciding to fetch fresh data or return cached data.
Identify the repeated operations: any loops, recursion, or array traversals.
- Primary operation: Checking cache validity and fetching fresh data if needed.
- How many times: Once per request, repeated for each incoming API call.
As the number of requests grows, each request does a quick cache check and sometimes fetches fresh data.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 cache checks, some fresh fetches |
| 100 | 100 cache checks, some fresh fetches |
| 1000 | 1000 cache checks, some fresh fetches |
Pattern observation: The number of operations grows roughly in direct proportion to the number of requests.
Time Complexity: O(n)
This means the time to handle requests grows linearly with the number of requests, since each request does a quick validation check.
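The linear pattern from the table can be demonstrated with a small simulation. The `check_count` counter and the simplified `handle_request` below are hypothetical, introduced only to count one validity check per request.

```python
# Simulated cache and a counter for validity checks (illustrative only).
cache = {}
check_count = 0

def handle_request(resource):
    global check_count
    check_count += 1           # one O(1) cache-validity check per request
    if resource not in cache:  # miss: fetch and store (also O(1) here)
        cache[resource] = f"data for {resource}"
    return cache[resource]

for n in (10, 100, 1000):
    cache.clear()
    check_count = 0
    for _ in range(n):
        handle_request("/resource")
    print(n, check_count)  # check_count equals n: work grows linearly
```

Each request costs a constant-time check, so total work scales as O(n) even though almost every request is a cache hit.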
[X] Wrong: "Cache validation makes the response time constant no matter how many requests come."
[OK] Correct: Each request still needs to check the cache, so the total work grows with the number of requests, even if each check is fast.
Understanding how caching affects time complexity helps you explain how APIs stay fast under load, a useful skill for real-world backend work.
"What if the cache validation involved a slow database call? How would the time complexity change?"
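One way to explore this question: if validation itself requires a slow round trip, the complexity class is still O(n) (one validation per request), but the constant cost per request jumps from an in-memory check to a network call. The sketch below simulates that with an artificial delay; `validate_via_db` and its latency are assumptions for illustration.

```python
import time

def validate_via_db(resource):
    # Simulated slow validation call (e.g., a database round trip).
    time.sleep(0.001)  # assumed latency; real round trips vary widely
    return True

def handle_request(resource, cache):
    # Still one validation per request: total work remains O(n),
    # but each request now pays the round-trip cost.
    if resource in cache and validate_via_db(resource):
        return cache[resource]
    cache[resource] = "fresh"
    return cache[resource]
```

In big-O terms nothing changes; in practice, per-request latency is dominated by the validation call, which is why real systems often pair validation with a local freshness window to skip the round trip entirely.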