HLD · System Design · ~20 min

Why caching reduces latency in HLD - Challenge Your Understanding

Challenge - 5 Problems
🧠 Conceptual · Intermediate
How does caching reduce latency in a web application?

Imagine a web application that fetches user profile data from a database. Which of the following best explains why adding a cache layer reduces the time it takes to get the data?

A. Cache duplicates the database, so the system uses more storage but does not affect speed.
B. Cache stores frequently accessed data closer to the user, so requests avoid slow database queries.
C. Cache compresses data, making the database queries faster by reducing data size.
D. Cache delays requests to the database, batching them to reduce load but increasing latency.
💡 Hint

Think about where the data is stored and how fast it can be accessed compared to the original source.

Architecture · Intermediate
Where to place cache to reduce latency effectively?

In a typical client-server system, where is the best place to add a cache to reduce latency for repeated data requests?

A. On the server's hard disk, to store data permanently for faster access.
B. Only inside the database server, so queries run faster internally.
C. Between the client and server, so the client can get data without contacting the server every time.
D. On the network routers, to speed up data packets.
💡 Hint

Consider where repeated requests happen and where storing data temporarily can save time.

Scaling · Advanced
Impact of cache size on latency reduction

A system uses caching to reduce latency. What happens if the cache size is too small compared to the data requested?

A. Cache hits increase, so latency decreases further.
B. Cache automatically expands to fit all data, so latency stays low.
C. Cache size does not affect latency as long as caching is enabled.
D. Cache misses increase, causing more requests to go to the slower database, increasing latency.
💡 Hint

Think about what happens when the cache cannot hold all the data needed.
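An undersized cache can be demonstrated with a minimal LRU sketch (the capacity and access pattern are made up for illustration): when the working set is larger than the cache, each new key evicts one that is about to be requested again, so every access misses.

```python
from collections import OrderedDict

class LRUCache:
    """Least-recently-used cache with a fixed capacity."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()
        self.misses = 0

    def get(self, key, load):
        if key in self.data:
            self.data.move_to_end(key)       # mark as recently used
            return self.data[key]
        self.misses += 1                     # miss: fall through to the slow store
        self.data[key] = load(key)
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)    # evict the least recently used entry
        return self.data[key]
```

Cycling through three keys with a capacity of two never produces a hit, so every request still pays the slow-store cost.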

Tradeoff · Advanced
Tradeoff between cache freshness and latency

Why might a system choose to serve slightly stale data from cache instead of always fetching fresh data from the database?

A. Serving stale data reduces latency because it avoids slower database queries, improving user experience.
B. Serving stale data requires more network bandwidth, increasing latency.
C. Serving fresh data always reduces latency because it avoids cache overhead.
D. Serving stale data increases latency because the system must check timestamps before responding.
💡 Hint

Consider the speed difference between cache and database and the impact of data freshness.
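The freshness-versus-latency tradeoff is usually implemented with a time-to-live (TTL). The sketch below is illustrative (the TTL value and data are assumptions): within the TTL the cache answers instantly, even if the database has since changed.

```python
import time

class TTLCache:
    """Cache whose entries expire after a fixed time-to-live."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}                       # key -> (value, expiry time)

    def get(self, key, fetch_fresh):
        entry = self.store.get(key)
        if entry and entry[1] > time.monotonic():
            return entry[0]                   # fast path: possibly stale value
        value = fetch_fresh(key)              # slow path: fresh from the database
        self.store[key] = (value, time.monotonic() + self.ttl)
        return value
```

Until the entry expires, readers get a low-latency but potentially stale answer; a shorter TTL trades some of that latency win back for freshness.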

Estimation · Expert
Estimating latency reduction from caching

A database query takes 200 ms on average. A cache lookup takes 5 ms. If 80% of requests hit the cache, what is the average latency per request?

A. 45 ms
B. 50 ms
C. 40 ms
D. 60 ms
💡 Hint

Calculate weighted average latency based on cache hit and miss rates.
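The weighted-average model the hint refers to can be written as a small helper. The numbers in the usage below are deliberately different from the question's so the exercise is not spoiled, and the model assumes a miss pays the cache lookup before falling through to the database.

```python
def average_latency(hit_rate, cache_ms, db_ms):
    """Expected per-request latency under a given cache hit rate.

    Hits cost only the cache lookup; misses cost the lookup plus
    the database query (an assumption of this sketch).
    """
    miss_rate = 1.0 - hit_rate
    return hit_rate * cache_ms + miss_rate * (cache_ms + db_ms)
```

For example, with a 75% hit rate, a 4 ms cache lookup, and a 100 ms database query, the expected latency is 0.75 × 4 + 0.25 × (4 + 100) = 29 ms.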