HLD · system_design · ~12 mins

Cache stampede prevention in HLD - Architecture Diagram

System Overview - Cache stampede prevention

This system prevents cache stampede, a problem where many users request the same data simultaneously when the cache expires, causing a heavy load on the database. The system uses techniques like request locking and early cache refresh to keep the cache healthy and reduce database overload.

Architecture Diagram
User
  |
  v
Load Balancer
  |
  v
API Gateway
  |
  v
Cache Layer <--> Request Lock Manager
  |               |
  v               v
Database         Background Refresher
  
Components
User
client
Sends requests for data
Load Balancer
load_balancer
Distributes incoming requests evenly to API Gateway instances
API Gateway
api_gateway
Receives requests, checks cache, and manages request locking
Cache Layer
cache
Stores frequently requested data to reduce database load
Request Lock Manager
lock_manager
Prevents multiple requests from querying the database simultaneously for the same data
Database
database
Stores the authoritative data
Background Refresher
service
Proactively refreshes cache before expiration to avoid stampede
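The Request Lock Manager above can be sketched as a per-key, non-blocking lock table. This is a minimal single-process illustration (a real deployment would use a distributed lock store shared by all API Gateway instances); the class name and methods are hypothetical:

```python
import threading

class RequestLockManager:
    """Grants at most one in-flight database fetch per cache key."""

    def __init__(self):
        self._guard = threading.Lock()   # protects the lock table itself
        self._locks = {}                 # key -> threading.Lock

    def acquire(self, key):
        """Return True if the caller becomes the sole fetcher for `key`."""
        with self._guard:
            lock = self._locks.setdefault(key, threading.Lock())
        # Non-blocking: losers get False immediately and can wait or serve stale data.
        return lock.acquire(blocking=False)

    def release(self, key):
        """Release the key so the next cache miss may fetch again."""
        with self._guard:
            lock = self._locks.pop(key, None)
        if lock is not None:
            lock.release()
```

Only the first caller per key wins the lock; concurrent callers for the same key get `False` and never reach the database.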
Request Flow - 12 Hops
1. User → Load Balancer
2. Load Balancer → API Gateway
3. API Gateway → Cache Layer (lookup)
4. Cache Layer → API Gateway (miss)
5. API Gateway → Request Lock Manager (acquire lock)
6. Request Lock Manager → API Gateway (lock granted)
7. API Gateway → Database (query)
8. Database → API Gateway (result)
9. API Gateway → Cache Layer (store result)
10. API Gateway → Request Lock Manager (release lock)
11. API Gateway → User (response)
12. Background Refresher → Cache Layer (proactive refresh)
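Hops 3 through 11 can be sketched as one read path. This is a simplified in-process version, assuming a dict-backed cache and a hypothetical fetch_from_db stand-in for the Database hop:

```python
import threading
import time

cache = {}               # Cache Layer: key -> (value, expiry timestamp)
locks = {}               # Request Lock Manager: key -> threading.Lock
guard = threading.Lock() # protects the lock table
TTL = 60.0               # cache entry lifetime in seconds

def fetch_from_db(key):
    """Hypothetical stand-in for the Database query (hops 7-8)."""
    return f"row-for-{key}"

def get(key):
    entry = cache.get(key)                 # hops 3-4: check the cache
    if entry and entry[1] > time.time():
        return entry[0]                    # cache hit: respond immediately
    with guard:
        lock = locks.setdefault(key, threading.Lock())
    if lock.acquire(blocking=False):       # hops 5-6: try to win the lock
        try:
            value = fetch_from_db(key)     # hops 7-8: single DB query
            cache[key] = (value, time.time() + TTL)  # hop 9: refill cache
            return value                   # hop 11: respond to the user
        finally:
            lock.release()                 # hop 10: release the lock
    # Lost the race: another request is already fetching.
    # Serve stale data if any exists rather than piling onto the DB.
    return entry[0] if entry else None
```

On a stampede, exactly one request runs `fetch_from_db`; the rest either hit the refilled cache or serve stale data.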
Failure Scenario
Component Fails: Request Lock Manager
Impact: Multiple requests query the database simultaneously on a cache miss, causing high DB load and possible overload.
Mitigation: Use highly available distributed locking (e.g. a lock with a TTL so a crashed holder cannot block others indefinitely), or fall back to rate limiting to reduce DB pressure.
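One way to keep the lock manager from becoming a single point of failure is to give each lock a TTL, so a crashed holder only stalls other requests briefly. A minimal sketch, simulating the lock store with a dict (a real system would use a replicated store such as Redis with SET NX PX):

```python
import time
import uuid

_locks = {}  # key -> (owner token, expiry); stands in for a replicated lock store

def try_lock(key, ttl=5.0):
    """Acquire `key` unless a live lock exists.

    Expired locks are reclaimed, so a dead lock holder can only
    block other requests for at most `ttl` seconds.
    Returns an owner token on success, None on failure.
    """
    now = time.time()
    held = _locks.get(key)
    if held and held[1] > now:
        return None                      # someone else holds a live lock
    token = uuid.uuid4().hex
    _locks[key] = (token, now + ttl)
    return token

def unlock(key, token):
    """Release only if we still own the lock (it may have expired and been taken)."""
    if key in _locks and _locks[key][0] == token:
        del _locks[key]
```

The owner token prevents a slow request from releasing a lock that has already expired and been re-acquired by someone else.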
Architecture Quiz - 3 Questions
Test your understanding
What component prevents multiple requests from querying the database at the same time for the same data?
A. Request Lock Manager
B. Load Balancer
C. Background Refresher
D. API Gateway
Design Principle
This architecture combines locking and proactive cache refresh to prevent cache stampede: the lock ensures only one request queries the database on a cache miss, and the background refresher renews entries before they expire to avoid load spikes.
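The early-refresh idea can also be made probabilistic: each reader independently decides to refresh slightly before expiry, spreading refreshes out instead of having them all fire at the expiry instant. A sketch of this technique (often called probabilistic early expiration, or "XFetch"), with a hypothetical helper name:

```python
import math
import random
import time

def should_refresh(expiry, delta, beta=1.0):
    """Decide whether this reader should refresh the entry early.

    expiry: absolute expiry timestamp of the cache entry
    delta:  seconds a recompute/DB fetch typically takes
    beta:   tuning knob; >1 refreshes more eagerly, <1 less so
    """
    # Random early margin: usually small, occasionally larger, so at most
    # a few readers near expiry trigger a refresh rather than all of them.
    margin = -delta * beta * math.log(1.0 - random.random())
    return time.time() + margin >= expiry
```

A reader for which this returns True recomputes the value and rewrites the cache entry; everyone else keeps serving the still-valid cached value, so the expiry moment never arrives with a cold cache.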