Why caching reduces server load in REST APIs - Performance Analysis
We want to see how caching affects the work a server does when handling requests.
How does caching change the number of steps the server takes as requests grow?
Analyze the time complexity of the following code snippet.
```javascript
// Pseudocode for handling API requests with caching
function handleRequest(request) {
  if (cache.has(request.key)) {
    return cache.get(request.key); // Fast return from cache
  } else {
    let data = fetchFromDatabase(request.key); // Slow database call
    cache.set(request.key, data); // Store in cache
    return data;
  }
}
```
This code checks if the requested data is in cache. If yes, it returns quickly. If not, it fetches from the database and saves it in cache.
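The pseudocode can be made runnable with minimal changes. This sketch assumes a `Map`-backed cache and uses a stand-in `fetchFromDatabase` (a hypothetical placeholder, not a real database client):

```javascript
// Runnable sketch of the pseudocode above. The cache is a plain Map,
// and fetchFromDatabase is a stand-in for a real (slow) database call.
const cache = new Map();

function fetchFromDatabase(key) {
  // Placeholder: a real implementation would query the database here.
  return `data-for-${key}`;
}

function handleRequest(request) {
  if (cache.has(request.key)) {
    return cache.get(request.key); // Fast return from cache
  }
  const data = fetchFromDatabase(request.key); // Slow database call
  cache.set(request.key, data);                // Store in cache
  return data;
}

// The first request for a key misses and reaches the database;
// repeat requests for the same key are served from the cache.
console.log(handleRequest({ key: "user:42" })); // "data-for-user:42"
console.log(cache.has("user:42"));              // true
```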
Identify the operations that repeat: loops, recursion, or array traversals.
- Primary operation: Checking cache and possibly fetching from database.
- How many times: Once per request, repeated for each incoming request.
As the number of requests grows, the server work depends on cache hits or misses.
| Input Size (requests) | Approx. Operations |
|---|---|
| 10 | Cache still cold: a noticeable share of database fetches |
| 100 | Cache warming up: some database fetches |
| 1000 | Cache mostly warm: almost all cache hits |
Pattern observation: With caching, most requests are fast checks, so work grows slowly even as requests increase.
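This pattern can be checked by simulation. The sketch below assumes traffic concentrates on a small set of hot keys (10 distinct keys across 1000 requests) and counts how many requests actually reach the "database":

```javascript
// Simulation: 1000 requests over 10 distinct keys, counting slow calls.
const cache = new Map();
let dbFetches = 0;

function fetchFromDatabase(key) {
  dbFetches += 1; // Count every slow database call
  return `data-for-${key}`;
}

function handleRequest(key) {
  if (cache.has(key)) return cache.get(key);
  const data = fetchFromDatabase(key);
  cache.set(key, data);
  return data;
}

// 1000 requests spread over only 10 distinct keys.
for (let i = 0; i < 1000; i++) {
  handleRequest(`key-${i % 10}`);
}

console.log(dbFetches); // 10 — each distinct key misses exactly once
```

Only 10 of 1000 requests do slow work: the database cost depends on the number of distinct keys, not the number of requests.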
Time Complexity: O(1) per request on average
A cache hit is a constant-time lookup, and each distinct key triggers at most one slow database fetch, so across many requests the work per request stays roughly constant and the server handles more traffic efficiently.
[X] Wrong: "Caching makes the server do less work only sometimes, so it doesn't really help much."
[OK] Correct: Caching lets most requests skip the slow database call entirely, so the server's workload stays low even as the number of requests grows.
Understanding how caching eliminates repeated work shows you can improve server response times and handle more users smoothly.
"What if the cache size is very small and often clears? How would the time complexity change?"
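One way to explore that question: give the cache a capacity smaller than the working set. This sketch (a hypothetical FIFO-eviction cache, chosen for simplicity) shows that when entries are evicted before they are reused, every request misses and the per-request cost degrades from a fast O(1) lookup to a full database call:

```javascript
// A tiny cache with FIFO eviction, deliberately smaller than the
// 10-key working set, so entries are evicted before being reused.
const CAPACITY = 2;
const cache = new Map();
let dbFetches = 0;

function fetchFromDatabase(key) {
  dbFetches += 1;
  return `data-for-${key}`;
}

function handleRequest(key) {
  if (cache.has(key)) return cache.get(key);
  const data = fetchFromDatabase(key);
  if (cache.size >= CAPACITY) {
    // Evict the oldest entry (Maps iterate in insertion order).
    cache.delete(cache.keys().next().value);
  }
  cache.set(key, data);
  return data;
}

// Cycling through 10 keys with room for only 2: every request misses.
for (let i = 0; i < 100; i++) {
  handleRequest(`key-${i % 10}`);
}

console.log(dbFetches); // 100 — the cache is too small to ever help
```

With constant eviction, the "mostly fast checks" pattern disappears: total work grows linearly with requests, each at full database cost.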