HLD · system_design · ~20 mins

Cache eviction policies (LRU, LFU, TTL) in HLD - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual · intermediate
Understanding LRU Cache Behavior

Consider a cache with a capacity of 3 items using the Least Recently Used (LRU) eviction policy. The cache is initially empty. The following keys are accessed in order: 1, 2, 3, 1, 4.

After these accesses, which keys remain in the cache?

A. [2, 3, 4]
B. [1, 3, 4]
C. [1, 2, 4]
D. [2, 3, 1]
💡 Hint

Remember that LRU removes the least recently used item when the cache is full.
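To check your reasoning after answering, you can replay the access sequence against a minimal LRU sketch. This is our own illustration built on Python's `OrderedDict` (the class name `LRUCache` is ours, not a library API); the most recently used key always sits at the end of the ordering.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU sketch: most recently used key sits at the end."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def access(self, key):
        if key in self.store:
            self.store.move_to_end(key)       # refresh recency
        else:
            if len(self.store) >= self.capacity:
                self.store.popitem(last=False)  # evict least recently used
            self.store[key] = True
```

Feed it the problem's sequence (`1, 2, 3, 1, 4` with capacity 3) and inspect `cache.store` to verify which keys survive.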

Architecture · intermediate
Choosing Cache Eviction Policy for Read-Heavy Workload

You are designing a caching layer for a read-heavy web application where some items are accessed very frequently and others rarely. Which cache eviction policy is most suitable to maximize the cache hit rate?

A. Least Frequently Used (LFU)
B. Random eviction
C. Time To Live (TTL) based eviction
D. Least Recently Used (LRU)
💡 Hint

Consider which policy favors items accessed many times over time.
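A frequency-based policy can be sketched in a few lines. This is a simplified illustration (the `LFUCache` name and its `access` method are our own, not a standard API): each key carries an access count, and eviction removes the key with the lowest count, so persistently hot items stay resident.

```python
from collections import Counter

class LFUCache:
    """Minimal LFU sketch: evicts the key with the lowest access count."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}
        self.counts = Counter()

    def access(self, key, value=True):
        if key in self.store:
            self.counts[key] += 1             # hot keys accumulate count
            return self.store[key]
        if len(self.store) >= self.capacity:
            victim = min(self.store, key=lambda k: self.counts[k])
            del self.store[victim]            # evict coldest key
            del self.counts[victim]
        self.store[key] = value
        self.counts[key] = 1
        return value
```

In a skewed read-heavy workload, a key accessed many times builds a high count and is never chosen as the eviction victim, while one-off keys are evicted quickly.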

Scaling · advanced
Scaling Cache with TTL Eviction in Distributed Systems

You have a distributed cache system with TTL-based eviction. What is a major challenge when scaling this system across multiple nodes?

A. Randomly evicting items to balance load
B. Handling cache misses due to LRU eviction
C. Tracking frequency counts for LFU eviction
D. Ensuring consistent TTL expiration times across nodes
💡 Hint

Think about time synchronization and expiration consistency.
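A single-node TTL cache usually stores an expiry timestamp per key and expires lazily on read, as in this sketch (our own illustration; the `TTLCache` name and the injectable `clock` parameter are assumptions for testability, not a library API). The distributed wrinkle is that each node computes `expires_at` from its *own* clock, so skewed clocks expire the same key at different times on different nodes.

```python
import time

class TTLCache:
    """Minimal TTL sketch with lazy expiration on read."""
    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock                    # injectable for testing
        self.store = {}                       # key -> (value, expires_at)

    def put(self, key, value):
        self.store[key] = (value, self.clock() + self.ttl)

    def get(self, key):
        entry = self.store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if self.clock() >= expires_at:        # expired: drop and report miss
            del self.store[key]
            return None
        return value
```

If two nodes run this same code but their clocks disagree, one node can serve a value the other has already expired, which is exactly the consistency challenge the question targets.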

Tradeoff · advanced
Tradeoffs Between LRU and LFU Cache Policies

Which statement best describes a key tradeoff between LRU and LFU cache eviction policies?

A. LRU requires tracking access frequency, LFU only tracks recency
B. LFU is simpler to implement than LRU and uses less memory
C. LRU adapts quickly to changing access patterns, while LFU may retain stale popular items longer
D. Both LRU and LFU evict items randomly when full
💡 Hint

Consider how each policy responds to changes in item popularity over time.
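The tradeoff shows up concretely if you replay the same access trace under both policies. The sketch below is our own simplified simulation (function names `run_lru`/`run_lfu` are assumptions, and the LFU variant counts accesses only while a key is cached): a key that was popular early but then goes cold behaves very differently under each policy.

```python
from collections import Counter, OrderedDict

def run_lru(trace, capacity):
    """Replay a trace under LRU; return the final set of cached keys."""
    cache = OrderedDict()
    for key in trace:
        if key in cache:
            cache.move_to_end(key)
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)     # evict least recently used
            cache[key] = None
    return set(cache)

def run_lfu(trace, capacity):
    """Replay a trace under LFU; return the final set of cached keys."""
    cache, counts = set(), Counter()
    for key in trace:
        if key in cache:
            counts[key] += 1
            continue
        if len(cache) >= capacity:
            victim = min(cache, key=lambda k: counts[k])
            cache.remove(victim)              # evict least frequently used
            del counts[victim]
        cache.add(key)
        counts[key] = 1
    return cache
```

Try a trace like `['old'] * 10 + ['a', 'b', 'c']` with capacity 2: LRU evicts `'old'` as soon as fresher keys arrive, while LFU keeps `'old'` resident on the strength of its historical count even though it is no longer being accessed.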

Estimation · expert
Estimating Cache Size for TTL Eviction

A cache stores items with a TTL of 10 minutes. On average, 1000 new items are added per minute. Assuming steady state and no evictions other than TTL expiration, approximately how many items will the cache hold?

A. 10,000 items
B. 1000 items
C. 100 items
D. 1,000,000 items
💡 Hint

Think about how many items accumulate over the TTL duration.
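For checking your estimate afterwards: at steady state this is an application of Little's law, N = λ × T, where λ is the arrival rate and T the average residence time (here, the TTL). A one-line sketch (the function name is ours, purely illustrative):

```python
def steady_state_items(arrivals_per_minute, ttl_minutes):
    """Little's law: average population N = arrival rate (λ) × residence time (T)."""
    return arrivals_per_minute * ttl_minutes
```

Plug in the problem's numbers (1000 items/minute, 10-minute TTL) to confirm your answer.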