HLD · system_design · ~12 mins

Cache eviction policies (LRU, LFU, TTL) in HLD - Architecture Diagram

System Overview - Cache eviction policies (LRU, LFU, TTL)

This system manages a cache to speed up data access by storing frequently used data temporarily. It uses eviction policies like LRU (Least Recently Used), LFU (Least Frequently Used), and TTL (Time To Live) to decide which data to remove when the cache is full or data expires.

Architecture Diagram
User
  |
  v
Load Balancer
  |
  v
API Gateway
  |
  v
Cache Manager
 /   |    \
LRU  LFU   TTL
  \   |    /
   Cache Storage
      |
      v
  Database
Components
User (client): Sends requests to access data.
Load Balancer (load_balancer): Distributes incoming requests evenly to the API Gateway.
API Gateway (api_gateway): Receives requests and routes them to the Cache Manager.
Cache Manager (service): Handles cache logic and applies eviction policies.
LRU (eviction_policy): Evicts the least recently used items when the cache is full.
LFU (eviction_policy): Evicts the least frequently used items when the cache is full.
TTL (eviction_policy): Evicts items after their time to live expires.
Cache Storage (cache): Stores cached data for fast access.
Database (database): Stores the original persistent data.
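The three eviction policies above can be sketched in a few lines each. This is a minimal illustration, not a production implementation: the class names and the tiny `get`/`put` interface are assumptions made for the example, and real caches (e.g. Redis) use approximated, constant-time variants of these policies.

```python
import time
from collections import OrderedDict


class LRUCache:
    """Evicts the least recently used entry when over capacity."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()  # insertion order doubles as recency order

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)          # touch: now most recently used
        return self.data[key]

    def put(self, key, value):
        self.data[key] = value
        self.data.move_to_end(key)
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict least recently used


class LFUCache:
    """Evicts the least frequently used entry when over capacity."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}
        self.freq = {}                      # key -> access count

    def get(self, key):
        if key not in self.data:
            return None
        self.freq[key] += 1                 # count every access
        return self.data[key]

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            victim = min(self.freq, key=self.freq.get)  # lowest access count
            del self.data[victim], self.freq[victim]
        self.data[key] = value
        self.freq[key] = self.freq.get(key, 0) + 1


class TTLCache:
    """Expires entries a fixed number of seconds after they are written."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.data = {}                      # key -> (value, expiry time)

    def get(self, key):
        entry = self.data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:  # lazily evict on read
            del self.data[key]
            return None
        return value

    def put(self, key, value):
        self.data[key] = (value, time.monotonic() + self.ttl)
```

Note the trade-off the sketch makes visible: LRU only reorders a linked structure on each access, while this naive LFU scans all counters to find a victim; TTL here evicts lazily on read, so expired entries occupy memory until they are next requested.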
Request Flow - 11 Hops
User → Load Balancer
Load Balancer → API Gateway
API Gateway → Cache Manager
Cache Manager → Cache Storage (cache lookup)
Cache Storage → Cache Manager (miss)
Cache Manager → Database (fetch on miss)
Database → Cache Manager
Cache Manager → Cache Storage (back-fill)
Cache Manager → API Gateway
API Gateway → Load Balancer
Load Balancer → User
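The cache-miss hops above amount to a cache-aside read path: check the cache, fall back to the database on a miss, and back-fill the cache before responding. A minimal sketch, where `cache` and `database` are plain-dict stand-ins for Cache Storage and the Database:

```python
cache = {}                                    # stand-in for Cache Storage
database = {"user:1": {"name": "Ada"}}        # stand-in for the Database


def read(key):
    value = cache.get(key)                    # Cache Manager -> Cache Storage
    if value is not None:
        return value                          # cache hit: skip the database
    value = database.get(key)                 # miss: Cache Manager -> Database
    if value is not None:
        cache[key] = value                    # back-fill: next read is a hit
    return value                              # returned via API Gateway
```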
Failure Scenario
Component Fails: Cache Storage
Impact: Cache misses increase, causing more direct database queries and higher latency.
Mitigation: The system continues to serve data from the database; the cache rebuilds as data is requested. Consider cache replication or a fallback cache.
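The mitigation above can be sketched as a degraded read path: if Cache Storage is unreachable, the Cache Manager swallows the error and serves directly from the database instead of failing the request. `DownCache` and `read_with_fallback` are hypothetical names for this illustration:

```python
class DownCache:
    """Stand-in for an unreachable Cache Storage node."""

    def get(self, key):
        raise ConnectionError("cache storage unreachable")


def read_with_fallback(key, cache_client, database):
    # Degrade gracefully: a cache outage must not become a request failure.
    try:
        value = cache_client.get(key)
        if value is not None:
            return value
    except ConnectionError:
        pass                          # cache unavailable: fall through to DB
    return database.get(key)          # serve from the source of truth
```

In a real deployment the fallback path should be protected (e.g. with a circuit breaker or request coalescing) so the database is not overwhelmed by the sudden flood of misses.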
Architecture Quiz - 3 Questions
Test your understanding
Which eviction policy removes the data that was used least recently?
A. LFU
B. LRU
C. TTL
D. Random Eviction
Design Principle
This design shows how different cache eviction policies help manage limited cache space by removing data based on usage patterns or expiration time, improving system performance and resource use.