Cache warming strategies in Redis - Time & Space Complexity
When we warm a cache, we load data into it before it is requested, so that later reads are served from fast memory instead of a slower backing store. Here we ask how the time to warm the cache grows as the number of keys increases.
Analyze the time complexity of the following Redis cache warming code.
```lua
-- Warm the cache: for each key, populate Redis only on a miss.
-- Note: `fetch_from_db` is pseudocode -- a Redis Lua script cannot
-- reach an external database, so in practice this loop runs in the
-- application (client) code rather than inside Redis itself.
for i, key in ipairs(keys) do
  local value = redis.call('GET', key)
  if not value then
    value = fetch_from_db(key)
    redis.call('SET', key, value)
  end
end
```
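The same per-key loop can be sketched as runnable Python, with a plain dict standing in for Redis and a hypothetical `fetch_from_db` callable (both are illustrative stand-ins, not real Redis API calls). Counting the checks makes the linear growth concrete:

```python
def warm_cache(keys, cache, fetch_from_db):
    """Warm `cache` (a dict standing in for Redis) one key at a time."""
    checks = 0
    for key in keys:
        checks += 1                          # one GET-style lookup per key
        if key not in cache:                 # cache miss
            cache[key] = fetch_from_db(key)  # fetch from the backing store and SET
    return checks

# Warming 10 keys performs exactly 10 checks, one per key.
cache = {}
keys = [f"user:{i}" for i in range(10)]
n_checks = warm_cache(keys, cache, fetch_from_db=lambda k: f"value-for-{k}")
print(n_checks)    # 10
print(len(cache))  # 10
```

Running it with 100 or 1000 keys reproduces the table below: the check count always equals the number of keys.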
This code checks each key in a list. If the key is not in Redis, it fetches the data from the database and stores it in Redis.
- Primary operation: Looping over each key to check and set cache entries.
- How many times: Once for each key in the list.
As the number of keys grows, the code does more work because it checks each key one by one.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 checks and possible sets |
| 100 | About 100 checks and possible sets |
| 1000 | About 1000 checks and possible sets |
Pattern observation: The work grows directly with the number of keys; doubling keys doubles the work.
Time Complexity: O(n)
This means the time to warm the cache grows linearly with the number of keys. Space complexity is also O(n), since the cache ends up holding one entry per warmed key.
[X] Wrong: "Cache warming is instant no matter how many keys we have."
[OK] Correct: Each key requires a check and possibly a fetch and set, so more keys mean more time.
Understanding how cache warming scales helps you design systems that stay fast as data grows. This skill shows you can think about real-world performance.
"What if we batch multiple keys in one Redis call instead of checking them one by one? How would the time complexity change?"