
Storage access patterns in HLD - Architecture Diagram

System Overview - Storage access patterns

This system demonstrates common storage access patterns used in software architectures. It highlights how data is accessed efficiently using caches, databases, and asynchronous queues to optimize performance and scalability.

Architecture Diagram
User
  |
  v
Load Balancer
  |
  v
API Gateway
  |
  +-------------------+
  |                   |
  v                   v
Cache               Service
  |                   |
  v                   v
Database <-------- Message Queue
Components
User (client): Initiates requests to the system.
Load Balancer (load_balancer): Distributes incoming requests evenly across API Gateway instances.
API Gateway (api_gateway): Handles client requests, routes them to services, and manages authentication.
Cache (cache): Stores frequently accessed data to reduce database load and latency.
Service (service): Processes business logic and interacts with storage components.
Database (database): Stores persistent data for the system.
Message Queue (queue): Handles asynchronous tasks and decouples service operations.
Request Flow - 12 Hops
1. User -> Load Balancer
2. Load Balancer -> API Gateway
3. API Gateway -> Cache
4. Cache -> API Gateway
5. API Gateway -> Service
6. Service -> Database
7. Service -> Cache
8. Service -> Message Queue
9. Message Queue -> Service
10. Service -> API Gateway
11. API Gateway -> Load Balancer
12. Load Balancer -> User
Failure Scenario
Component fails: Cache
Impact: Cache misses increase, causing more direct database queries and higher latency.
Mitigation: The system continues to operate by querying the database; the cache can be rebuilt asynchronously.
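The mitigation described above amounts to wrapping cache access so that a cache outage degrades to direct database reads instead of failing the request. A sketch under those assumptions, with a deliberately broken cache class standing in for the failed component:

```python
class CacheDown(Exception):
    """Raised by the stand-in cache to simulate the outage."""

class BrokenCache:
    def get(self, key):
        raise CacheDown()       # every lookup fails, as in the scenario

db = {"user:1": {"name": "Ada"}}  # stand-in for the persistent database
cache = BrokenCache()

def read(key):
    """Prefer the cache, but keep serving (at higher latency) when it is
    down by querying the database directly."""
    try:
        return cache.get(key)
    except CacheDown:
        return db.get(key)      # mitigation: fall back to the database

result = read("user:1")  # still served, despite the failed cache
```

Requests succeed throughout the outage; the cost is the extra database load and latency named in the impact line.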
Architecture Quiz - 3 Questions
Test your understanding
Which component is responsible for distributing incoming user requests evenly?
A. API Gateway
B. Load Balancer
C. Cache
D. Message Queue
Design Principle
This architecture demonstrates efficient storage access by using a cache to reduce database load, an API Gateway to centralize request handling, and a message queue to decouple and asynchronously process background tasks, improving scalability and responsiveness.
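The "distributes incoming requests evenly" behavior attributed to the Load Balancer can be sketched as simple round-robin selection; the gateway instance names below are hypothetical:

```python
import itertools

gateways = ["api-gw-1", "api-gw-2", "api-gw-3"]  # API Gateway instances
rotation = itertools.cycle(gateways)             # endless round-robin order

def route(request):
    """Send each incoming request to the next gateway in rotation."""
    return next(rotation), request

# Six requests spread evenly: each gateway receives exactly two.
targets = [route({"id": i})[0] for i in range(6)]
```

Real load balancers add health checks and weighting, but round-robin captures the even-distribution property the quiz question points at.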