
Cache-aside pattern in Redis - Time & Space Complexity

Time Complexity: Cache-aside pattern
O(n)
Understanding Time Complexity

When using the cache-aside pattern, we want to know how the time to get data changes as the data size grows.

We ask: how does the number of steps grow when we check the cache and then, on a miss, the database?

Scenario Under Consideration

Analyze the time complexity of the following Redis cache-aside code snippet.


// Try to get data from cache
data = GET user:123

// If cache miss, fetch from the database and update the cache
if data is nil {
  data = DB.GET("user:123")
  SET user:123 data
}

// Return data (from the cache or the database)
return data
    

This code tries to get user data from Redis cache first. If missing, it fetches from the database and updates the cache.
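The same flow can be sketched as a small runnable example. This is illustrative only: plain dicts stand in for Redis and the database, and the names `cache`, `db`, and `get_user` are assumptions, not part of the original snippet.

```python
# Stand-ins for Redis and the backing database (illustrative only).
cache = {}
db = {"user:123": {"name": "Ada"}}

def get_user(key):
    """Cache-aside read: check the cache first, fall back to the DB on a miss."""
    data = cache.get(key)      # GET user:123
    if data is None:           # cache miss
        data = db[key]         # DB.GET("user:123")
        cache[key] = data      # SET user:123 data
    return data                # from the cache or the database

first = get_user("user:123")   # miss: fetched from db, then cached
second = get_user("user:123")  # hit: served straight from the cache
```

After the first call the key is populated, so every later read of the same key skips the database entirely.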

Identify Repeating Operations

Look for loops, recursion, or array traversals that repeat work.

  • Primary operation: Single key lookup in cache and possibly one database fetch.
  • How many times: Exactly once per data request.
How Execution Grows With Input

Each request checks the cache once and may check the database once if needed.

Input Size (n) | Approx. Operations
10             | 10 cache lookups, up to 10 database fetches
100            | 100 cache lookups, up to 100 database fetches
1000           | 1000 cache lookups, up to 1000 database fetches

Pattern observation: The number of operations grows directly with the number of requests.

Final Time Complexity

Time Complexity: O(n)

This means the total time to handle n requests grows linearly with the number of requests; each individual request still takes roughly constant time.
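One quick way to see the linear growth is to count operations across n requests. The helper below is an illustrative sketch, assuming n distinct, initially uncached keys (the worst case from the table above):

```python
def serve_requests(n):
    """Simulate n cache-aside reads over n distinct cold keys, counting work."""
    cache = {}
    db = {f"user:{i}": i for i in range(n)}  # illustrative backing store
    lookups = db_fetches = 0
    for i in range(n):
        key = f"user:{i}"
        lookups += 1                 # every request checks the cache once
        if key not in cache:         # cold key: also hit the database
            db_fetches += 1
            cache[key] = db[key]
    return lookups, db_fetches

print(serve_requests(10))    # (10, 10)
print(serve_requests(100))   # (100, 100)
```

Doubling n doubles both counters, which is exactly the straight-line growth O(n) describes. With repeated keys the database fetches drop, but the cache lookups still scale with n.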

Common Mistake

[X] Wrong: "Cache-aside pattern reduces time complexity to constant time regardless of requests."

[OK] Correct: Cache-aside lowers the latency of an individual request on a hit, but every request still performs at least one cache check, so total time grows with the number of requests rather than staying fixed.

Interview Connect

Understanding how cache-aside scales helps you explain real-world data fetching strategies clearly and confidently.

Self-Check

"What if we batch multiple keys in one cache request? How would the time complexity change?"