How to Design a Cache System: Low-Level Design (LLD) Guide
To design a cache system in LLD, define the cache storage, the eviction policy, and the data-consistency strategy. Use key-value storage with LRU or TTL eviction, and keep the cache synchronized with the main data source.

Syntax
A cache system typically includes these parts:
- Cache Storage: Holds cached data as key-value pairs.
- Eviction Policy: Decides which data to remove when full (e.g., LRU, TTL).
- Get/Set Methods: Retrieve or store data in cache.
- Consistency: Keeps cache data synced with the main database.
```javascript
class Cache {
  constructor(capacity) {
    this.capacity = capacity;
    this.cache = new Map(); // stores key-value pairs in insertion order
  }

  get(key) {
    if (!this.cache.has(key)) return null;
    // Move key to the end to mark it as recently used
    const value = this.cache.get(key);
    this.cache.delete(key);
    this.cache.set(key, value);
    return value;
  }

  set(key, value) {
    if (this.cache.has(key)) {
      this.cache.delete(key);
    } else if (this.cache.size === this.capacity) {
      // Evict the least recently used entry (the first key in the Map)
      const firstKey = this.cache.keys().next().value;
      this.cache.delete(firstKey);
    }
    this.cache.set(key, value);
  }
}
```
Example
This example shows a simple LRU Cache with fixed capacity. It stores key-value pairs and evicts the least recently used item when full.
```javascript
const cache = new Cache(2);
cache.set('a', 1);
cache.set('b', 2);
console.log(cache.get('a')); // Output: 1
cache.set('c', 3);           // Evicts key 'b'
console.log(cache.get('b')); // Output: null
cache.set('d', 4);           // Evicts key 'a'
console.log(cache.get('a')); // Output: null
console.log(cache.get('c')); // Output: 3
console.log(cache.get('d')); // Output: 4
```
Output
1
null
null
3
4
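The example above uses LRU eviction; the TTL (Time To Live) policy mentioned earlier can be sketched in a similar shape. This is a minimal sketch, not code from the guide: the `TTLCache` name and the `ttlMs`/`expiresAt` fields are illustrative, and expired entries are removed lazily on read rather than by a background timer.

```javascript
// Minimal TTL cache sketch (illustrative names, lazy expiry on read).
class TTLCache {
  constructor(defaultTtlMs) {
    this.defaultTtlMs = defaultTtlMs;
    this.cache = new Map(); // key -> { value, expiresAt }
  }

  set(key, value, ttlMs = this.defaultTtlMs) {
    this.cache.set(key, { value, expiresAt: Date.now() + ttlMs });
  }

  get(key) {
    const entry = this.cache.get(key);
    if (!entry) return null;
    if (Date.now() > entry.expiresAt) {
      // Lazy eviction: drop the entry the first time it is seen expired.
      this.cache.delete(key);
      return null;
    }
    return entry.value;
  }
}
```

A production cache would typically also sweep expired entries periodically so that keys which are never read again do not linger in memory.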
Common Pitfalls
Common mistakes when designing cache systems include:
- Not handling cache eviction properly, causing memory overflow.
- Ignoring cache consistency, leading to stale data.
- Using inefficient data structures that cause slow lookups.
- Not considering thread safety in concurrent environments.
Always choose the right eviction policy and keep cache synchronized with the source.
```javascript
/* Wrong: using an array for cache storage forces O(n) lookups */
class BadCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.cache = [];
  }

  get(key) {
    for (const item of this.cache) {
      if (item.key === key) return item.value;
    }
    return null;
  }

  set(key, value) {
    if (this.cache.length >= this.capacity) {
      this.cache.shift(); // removes the oldest insertion, but ignores recency (no LRU)
    }
    this.cache.push({ key, value });
  }
}

/* Right: use a Map for O(1) access and LRU eviction, as shown in the Syntax section */
```
Quick Reference
- Cache Storage: Use hash maps or dictionaries for fast access.
- Eviction Policies: LRU (Least Recently Used), LFU (Least Frequently Used), TTL (Time To Live).
- Consistency: Use write-through, write-back, or cache invalidation strategies.
- Concurrency: Use locks or atomic operations to avoid race conditions.
- Capacity: Set based on memory limits and expected load.
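Of the consistency strategies listed above, write-through is the simplest to sketch. The following is a minimal illustration, assuming a backing store with Map-like `get`/`set` methods; the `store` object here is a stand-in for a real database client, not an API from the guide.

```javascript
// Write-through cache sketch: every write goes to the main store first,
// so the cache never holds data the store has not seen.
class WriteThroughCache {
  constructor(store) {
    this.store = store;   // stand-in for the main data source
    this.cache = new Map();
  }

  get(key) {
    if (this.cache.has(key)) return this.cache.get(key); // cache hit
    const value = this.store.get(key);                   // miss: read through
    if (value !== undefined) this.cache.set(key, value);
    return value ?? null;
  }

  set(key, value) {
    this.store.set(key, value); // update the source of truth first
    this.cache.set(key, value); // then the cache
  }
}
```

Write-back, by contrast, would update the cache first and flush to the store later, trading durability for write latency.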
Key Takeaways
- Use efficient data structures like Map for O(1) cache operations.
- Implement eviction policies such as LRU to manage limited cache size.
- Keep cache data consistent with the main data source to avoid stale reads.
- Consider concurrency and thread safety in multi-threaded environments.
- Choose cache capacity based on system memory and performance needs.
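JavaScript itself is single-threaded, but concurrent async callers can still interleave around `await` points. One common way to serialize access, sketched below under that assumption, is to chain operations on a shared promise; the `LockedCache` and `_withLock` names are illustrative, not part of the guide's code.

```javascript
// Sketch of promise-chain serialization around cache operations
// (illustrative; a real multi-process setup would need an external lock).
class LockedCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.inner = new Map();
    this.lock = Promise.resolve(); // tail of the operation chain
  }

  _withLock(fn) {
    const result = this.lock.then(fn);
    this.lock = result.catch(() => {}); // keep the chain alive on errors
    return result;
  }

  get(key) {
    return this._withLock(() => this.inner.get(key) ?? null);
  }

  set(key, value) {
    return this._withLock(() => {
      if (!this.inner.has(key) && this.inner.size === this.capacity) {
        // Evict the oldest entry when at capacity
        this.inner.delete(this.inner.keys().next().value);
      }
      this.inner.set(key, value);
    });
  }
}
```

Because every operation returns a promise queued behind the previous one, two async callers can no longer interleave a read-modify-write on the same key.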