Proxy cache key in Nginx - Time & Space Complexity
We want to understand how the time to find or store a cached response grows as more requests come in.
How does the proxy cache key affect the speed of caching operations?
Analyze the time complexity of this nginx proxy cache key configuration.
```nginx
proxy_cache_key "$scheme://$host$request_uri";
proxy_cache my_cache;
proxy_cache_valid 200 10m;
proxy_pass http://backend;
```
This configuration sets the cache key to the full request URL (`$scheme://$host$request_uri`) and uses that key to store and retrieve cached responses.
Look at what repeats when handling many requests.
- Primary operation: Hashing the cache key and checking the corresponding cache entry.
- How many times: Once per incoming request.
As the number of cached items grows, finding the right cache entry still takes a constant amount of work on average, thanks to hashing.
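This constant-time behavior can be sketched with a small Python simulation. It is a hypothetical model, not Nginx's actual implementation: the cache key is built the same way the `proxy_cache_key` directive builds it, hashed, and used to probe a hash table, so the lookup cost does not depend on how many entries are stored.

```python
import hashlib

# Hypothetical in-memory cache, keyed the way the proxy_cache_key
# directive above builds its key. Not Nginx's real storage engine.
cache = {}

def make_key(scheme, host, request_uri):
    """Mimic proxy_cache_key "$scheme://$host$request_uri"."""
    return f"{scheme}://{host}{request_uri}"

def store(scheme, host, request_uri, response):
    """Hash the key once and write the entry at that slot."""
    digest = hashlib.md5(make_key(scheme, host, request_uri).encode()).hexdigest()
    cache[digest] = response

def lookup(scheme, host, request_uri):
    """One hash + one table probe per request: average O(1),
    regardless of how many responses are cached."""
    digest = hashlib.md5(make_key(scheme, host, request_uri).encode()).hexdigest()
    return cache.get(digest)
```

Whether the table holds 10 or 10,000 entries, `lookup` performs the same two steps: hash the key, probe the table.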
| Cached Items (n) | Approx. Operations per Request |
|---|---|
| 10 | About 1 hash + check |
| 100 | About 1 hash + check |
| 1000 | About 1 hash + check |
Pattern observation: The number of operations stays roughly constant regardless of the number of cached items.
Time Complexity: O(1) (average case)
This means the time to find a cached response stays constant as the cache size grows.
[X] Wrong: "The cache lookup time grows linearly no matter what."
[OK] Correct: Nginx proxy cache hashes the key to compute a direct storage location (e.g., file path), enabling average O(1) lookups.
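Concretely, Nginx names each cache file after the MD5 of the cache key, and the `levels` parameter of `proxy_cache_path` uses trailing hex characters of that digest as subdirectory names. Here is a minimal Python sketch of that path derivation; the cache directory `/data/nginx/cache` is illustrative:

```python
import hashlib

def cache_file_path(cache_dir, key, levels=(1, 2)):
    """Derive an Nginx-style cache file path: the file name is the MD5
    of the cache key, and each entry in `levels` takes that many
    trailing hex characters of the digest as a subdirectory name."""
    digest = hashlib.md5(key.encode()).hexdigest()
    parts, pos = [], len(digest)
    for n in levels:
        parts.append(digest[pos - n:pos])  # peel chars off the end
        pos -= n
    return "/".join([cache_dir, *parts, digest])
```

For a digest ending in `...029c` with `levels=1:2`, this yields a path like `/data/nginx/cache/c/29/<md5>`. Locating an entry is a fixed amount of hashing and path construction, independent of how many responses are cached.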
Understanding how cache keys affect lookup time helps you explain how caching impacts performance in real systems.
"What if the cache indexed its keys with a balanced tree instead of a hash table? How would the time complexity change?"