
Cache eviction policies (LRU, LFU, TTL) in HLD - System Design Guide

Problem Statement
Caches have limited space, so when they fill up, existing entries must be removed to make room for new data. Without a sound policy for deciding what to remove, the cache may evict useful data, causing extra slow database or API calls and hurting hit rate and latency.
Solution
Cache eviction policies decide which cached items to remove when space is needed. LRU (least recently used) removes the items accessed longest ago, on the assumption that recent use predicts future use. LFU (least frequently used) removes the items accessed least often, on the assumption that frequent use predicts future use. TTL (time to live) removes items after a fixed lifetime, ensuring data freshness.
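The LRU policy described above can be sketched with Python's `OrderedDict`, whose insertion order doubles as recency order (the `LRUCache` name and capacity of 2 are illustrative):

```python
from collections import OrderedDict

class LRUCache:
    """Bounded cache that evicts the least recently used entry when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()  # insertion order doubles as recency order

    def get(self, key, default=None):
        if key not in self._data:
            return default
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict the least recently used entry

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")     # touch "a" so it becomes most recently used
cache.put("c", 3)  # cache is full: "b", the least recently used, is evicted
```

After this sequence, `"b"` is gone while `"a"` and `"c"` remain, even though `"a"` was inserted first.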
Architecture
┌─────────────┐        ┌─────────────┐        ┌─────────────┐
│   Client    │  ───→  │   Cache     │  ───→  │   Database  │
└─────────────┘        └─────────────┘        └─────────────┘
        ↑                    │  ↑  │
        │                    │  │  │
        │                    │  │  └─ Eviction based on policy
        │                    │  └───── LRU / LFU / TTL
        └────────────────────┘

This diagram shows a client requesting data from a cache. The cache serves data if present or fetches from the database if missing. When the cache is full, it evicts items based on LRU, LFU, or TTL policies.
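The read path in the diagram is the cache-aside pattern. A minimal sketch, assuming a hypothetical `fetch_from_db` callable standing in for the real data source (eviction happens inside the cache's own `put` when it is full):

```python
class SimpleCache:
    """Minimal unbounded cache for illustration; a real cache would evict."""

    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def put(self, key, value):
        self._data[key] = value

def read(key, cache, fetch_from_db):
    """Cache-aside read: serve from cache on a hit, else fetch and populate."""
    value = cache.get(key)
    if value is not None:
        return value               # cache hit: no database call
    value = fetch_from_db(key)     # cache miss: go to the database
    cache.put(key, value)          # populate; may trigger an eviction
    return value

db_calls = []
def fetch_from_db(key):           # stand-in for a real database query
    db_calls.append(key)
    return key.upper()

cache = SimpleCache()
read("user:1", cache, fetch_from_db)  # miss: hits the database
read("user:1", cache, fetch_from_db)  # hit: served from the cache
```

Only the first read reaches the database; the second is absorbed by the cache.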

Trade-offs
✓ Pros
LRU is simple and adapts well to recent usage patterns.
LFU captures long-term popularity of items, reducing eviction of frequently accessed data.
TTL ensures data freshness by removing stale entries automatically.
✗ Cons
LRU can evict frequently used items if they are not recently accessed.
LFU requires tracking usage counts, increasing memory and computation overhead.
TTL may evict still-useful data prematurely if the lifetime is set too short.
Use LRU when recent access is a good predictor of future use and the cache size is moderate. Use LFU when long-term frequency matters and you can afford the extra tracking. Use TTL when data freshness is critical and stale data must be removed after a fixed time.
Avoid LRU if access patterns are highly irregular or bursty, since one-off scans can flush useful entries. Avoid LFU if the tracking overhead is too costly or usage patterns change rapidly. Avoid TTL if data validity varies widely or precise freshness is not required.
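The extra tracking that LFU pays for is a per-key access count. A minimal sketch of that bookkeeping, assuming ties are broken arbitrarily (production LFU implementations use more efficient structures than a linear scan):

```python
from collections import Counter

class LFUCache:
    """Evicts the entry with the lowest access count when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = {}
        self._counts = Counter()  # the extra per-key tracking LFU requires

    def get(self, key, default=None):
        if key not in self._data:
            return default
        self._counts[key] += 1    # every hit bumps the frequency
        return self._data[key]

    def put(self, key, value):
        if key not in self._data and len(self._data) >= self.capacity:
            # linear scan for the least frequently used entry (ties arbitrary)
            victim, _ = min(self._counts.items(), key=lambda kv: kv[1])
            del self._data[victim]
            del self._counts[victim]
        self._data[key] = value
        self._counts[key] += 1

cache = LFUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")
cache.get("a")     # "a" now has a higher count than "b"
cache.put("c", 3)  # cache full: "b", the least frequently used, is evicted
```

Unlike LRU, `"a"` survives here because of its accumulated count, not because it was touched last.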
Real World Examples
Amazon
Amazon uses LRU caching in their product recommendation system to keep recently viewed items quickly accessible.
Netflix
Netflix uses LFU caching in their content delivery to prioritize frequently watched shows and movies.
Twitter
Twitter applies TTL caching for user session tokens to ensure they expire after a set time for security.
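Expiry of the kind used for session tokens can be sketched with a time-to-live wrapper that evicts lazily on read (the `TTLCache` name and the tiny 0.05-second lifetime are illustrative):

```python
import time

class TTLCache:
    """Entries expire a fixed number of seconds after being written."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._data = {}  # key -> (value, expiry timestamp)

    def put(self, key, value):
        self._data[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self._data.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._data[key]  # lazily evict the stale entry on read
            return default
        return value

tokens = TTLCache(ttl_seconds=0.05)
tokens.put("session:42", "token-abc")
tokens.get("session:42")  # fresh: returns the token
time.sleep(0.06)
tokens.get("session:42")  # expired: returns None
```

Real caches also run a background sweep so expired entries do not linger unread; lazy eviction alone keeps the sketch short.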
Alternatives
Random Eviction
Evicts a random cache item instead of based on usage or time.
Use when: simplicity is more important than hit rate and the cache is small.
Most Recently Used (MRU)
Evicts the most recently used items, opposite of LRU.
Use when: older data is more valuable than recent data, such as in certain streaming scenarios.
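Random eviction, the first alternative above, needs no usage tracking at all. A minimal sketch:

```python
import random

class RandomEvictionCache:
    """When full, removes a uniformly random entry; no usage tracking at all."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = {}

    def get(self, key, default=None):
        return self._data.get(key, default)

    def put(self, key, value):
        if key not in self._data and len(self._data) >= self.capacity:
            victim = random.choice(list(self._data))  # pick any entry to drop
            del self._data[victim]
        self._data[key] = value

cache = RandomEvictionCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.put("c", 3)  # cache full: either "a" or "b" is evicted at random
```

The trade-off is visible in the last line: the evicted key is unpredictable, which is acceptable only when any entry is roughly as valuable as any other.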
Summary
Cache eviction policies decide which data to remove when cache space runs out.
LRU removes least recently used, LFU removes least frequently used, and TTL removes data after a set time.
Choosing the right policy depends on access patterns, data freshness needs, and system constraints.