Consider a cache with a capacity of 3 items using the Least Recently Used (LRU) eviction policy. The cache is initially empty. The following keys are accessed in order: 1, 2, 3, 1, 4.
After these accesses, which keys remain in the cache?
Remember that LRU removes the least recently used item when the cache is full.
Initially, keys 1, 2, 3 fill the cache. Accessing key 1 makes it most recently used. When key 4 is accessed, the least recently used key (2) is evicted. Remaining keys are 1, 3, and 4.
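The trace above can be checked with a minimal LRU simulation. This sketch uses Python's `OrderedDict` to track recency; `lru_trace` is a hypothetical helper name, not part of any library:

```python
from collections import OrderedDict

def lru_trace(capacity, accesses):
    """Simulate an LRU cache; return the keys left, ordered LRU to MRU."""
    cache = OrderedDict()  # keys ordered from least to most recently used
    for key in accesses:
        if key in cache:
            cache.move_to_end(key)         # a hit refreshes recency
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict the least recently used key
            cache[key] = True
    return list(cache)

print(lru_trace(3, [1, 2, 3, 1, 4]))  # [3, 1, 4]
```

The final order `[3, 1, 4]` matches the walkthrough: key 2 was evicted because accessing key 1 pushed 2 to the least-recently-used position before key 4 arrived.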
You are designing a caching layer for a read-heavy web application where some items are accessed very frequently and others rarely. Which cache eviction policy is most suitable to maximize cache hit rate?

Consider which policy favors items accessed many times over time.
LFU keeps items accessed most frequently, which suits read-heavy workloads with hot items. LRU favors recent access but may evict frequently accessed older items. TTL evicts based on time, not usage frequency.
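The core of LFU is simply choosing the eviction victim by lifetime access count. A minimal sketch, using a `Counter` over a hypothetical access log (the keys and log are illustrative, not from the question):

```python
from collections import Counter

def lfu_victim(counts):
    """Return the key with the lowest access count (ties broken by first seen)."""
    return min(counts, key=counts.get)

# Hypothetical read-heavy access log with one hot item.
accesses = ["home", "home", "home", "about", "home", "faq", "home"]
counts = Counter(accesses)
print(lfu_victim(counts))  # a cold key ("about" or "faq"), never "home"
```

Under this policy the hot key `"home"` is never the eviction victim, which is exactly why LFU suits workloads with stable hot items.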
You have a distributed cache system with TTL-based eviction. What is a major challenge when scaling this system across multiple nodes?
Think about time synchronization and expiration consistency.
In distributed caches with TTL, nodes must agree on when entries expire. Clock skew between nodes, or delays in propagating writes, can cause inconsistent evictions: one node serves an entry another has already expired, leading to stale reads or unexpected misses. LRU and LFU have their own scaling challenges, but those are unrelated to TTL expiration.
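The skew problem can be made concrete with a minimal single-node TTL cache where each node's own clock decides expiry. This is an illustrative sketch (the `TTLCache` class and the injected `now` parameter are assumptions for the demo, not a real library API):

```python
import time

class TTLCache:
    """Minimal TTL cache; expiry is judged by *this* node's clock,
    which is exactly why clock skew causes cross-node inconsistency."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expires_at)

    def put(self, key, value, now=None):
        now = time.monotonic() if now is None else now
        self.store[key] = (value, now + self.ttl)

    def get(self, key, now=None):
        now = time.monotonic() if now is None else now
        item = self.store.get(key)
        if item is None:
            return None
        value, expires_at = item
        if now >= expires_at:  # expired according to the local clock
            del self.store[key]
            return None
        return value

# Two nodes store the same entry with the same 10-minute TTL,
# but node b's clock runs ahead of node a's:
a, b = TTLCache(ttl_seconds=600), TTLCache(ttl_seconds=600)
a.put("k", "v", now=0)
b.put("k", "v", now=0)
print(a.get("k", now=599))  # 'v'  -- still fresh on node a
print(b.get("k", now=650))  # None -- already expired on node b
```

The same logical entry is simultaneously alive on one node and expired on another, which is the inconsistency the explanation describes.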
Which statement best describes a key tradeoff between LRU and LFU cache eviction policies?
Consider how each policy responds to changes in item popularity over time.
LRU evicts least recently used items, adapting quickly if access patterns change. LFU evicts least frequently used items, which can cause it to keep items that were popular in the past but are no longer accessed.
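LFU's weakness described above can be demonstrated with a tiny simulation: a key that was hot early in the trace survives eviction long after the workload shifts. The `lfu_cache_trace` helper and the trace are illustrative assumptions:

```python
from collections import Counter

def lfu_cache_trace(capacity, accesses):
    """Tiny LFU simulation: evict the cached key with the lowest lifetime count."""
    cache, counts = set(), Counter()
    for key in accesses:
        counts[key] += 1
        if key not in cache:
            if len(cache) >= capacity:
                victim = min(cache, key=lambda k: counts[k])
                cache.discard(victim)
            cache.add(key)
    return cache

# "old" was hot early on; then the workload shifts entirely to x, y, z:
trace = ["old"] * 10 + ["x", "y", "z", "x", "y", "z"]
print(lfu_cache_trace(3, trace))  # "old" is still cached despite no recent use
```

An LRU cache of the same capacity would have evicted `"old"` after the first three new keys arrived, illustrating the adaptability tradeoff.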
A cache stores items with a TTL of 10 minutes. On average, 1000 new items are added per minute. Assuming steady state and no evictions other than TTL expiration, approximately how many items will the cache hold?
Think about how many items accumulate over the TTL duration.
With 1000 items added per minute and a TTL of 10 minutes, the cache holds items added in the last 10 minutes: 1000 * 10 = 10,000 items.
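The steady-state count is just arrival rate times lifetime (a cache-sized instance of Little's law), which a one-liner confirms:

```python
# Steady-state size = arrival rate x TTL (Little's law: L = lambda * W).
arrival_rate_per_min = 1000
ttl_minutes = 10
steady_state_items = arrival_rate_per_min * ttl_minutes
print(steady_state_items)  # 10000
```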