C# Program to Implement LRU Cache Efficiently
An LRU (Least Recently Used) cache can be implemented efficiently by combining a Dictionary for fast key lookups with a LinkedList that tracks usage order: each access moves the item's node to the front of the list, and when capacity is exceeded the least recently used item at the back is evicted.
Examples
Input: capacity = 2; Put(1,1), Put(2,2), Get(1), Put(3,3), Get(2)
Output: Get(1) returns 1. Get(2) returns -1, because key 2 was the least recently used entry and was evicted when (3,3) was inserted.
How to Think About It
An LRU cache must answer two questions in O(1): "is this key present?" and "which key was used least recently?" A Dictionary answers the first; a linked list ordered from most to least recently used answers the second. Storing each key's LinkedListNode in the dictionary ties the two structures together.
Algorithm
1. Get(key): if the key is missing, return -1. Otherwise move its node to the front of the list (most recently used) and return its value.
2. Put(key, value): if the key already exists, remove its old node from the list. Otherwise, if the cache is full, remove the node at the back of the list and delete its key from the dictionary.
3. Add a new (key, value) node at the front of the list and record it in the dictionary.
Code
```csharp
using System;
using System.Collections.Generic;

public class LRUCache
{
    private readonly int capacity;
    // Maps each key to its node in the usage list for O(1) lookup.
    private readonly Dictionary<int, LinkedListNode<(int key, int value)>> cacheMap;
    // Front = most recently used, back = least recently used.
    private readonly LinkedList<(int key, int value)> lruList;

    public LRUCache(int capacity)
    {
        this.capacity = capacity;
        cacheMap = new Dictionary<int, LinkedListNode<(int key, int value)>>();
        lruList = new LinkedList<(int key, int value)>();
    }

    public int Get(int key)
    {
        if (!cacheMap.TryGetValue(key, out var node)) return -1;
        // Move the accessed node to the front to mark it as most recently used.
        lruList.Remove(node);
        lruList.AddFirst(node);
        return node.Value.value;
    }

    public void Put(int key, int value)
    {
        if (cacheMap.TryGetValue(key, out var node))
        {
            // Key already present: drop its old node; the new one is added below.
            lruList.Remove(node);
        }
        else if (cacheMap.Count == capacity)
        {
            // Cache full: evict the least recently used entry at the back.
            var lastNode = lruList.Last;
            cacheMap.Remove(lastNode.Value.key);
            lruList.RemoveLast();
        }
        lruList.AddFirst((key, value));
        cacheMap[key] = lruList.First;
    }
}

class Program
{
    static void Main()
    {
        var cache = new LRUCache(2);
        cache.Put(1, 1);
        cache.Put(2, 2);
        Console.WriteLine(cache.Get(1)); // 1
        cache.Put(3, 3);                 // evicts key 2
        Console.WriteLine(cache.Get(2)); // -1 (evicted)
    }
}
```
Dry Run
Let's trace the example where we put keys 1 and 2, get key 1, put key 3, and get key 2 through the code.
Put(1,1)
Cache is empty. Add (1,1) to front. Cache: {(1,1)}
Put(2,2)
Add (2,2) to front. Cache order: (2,2) -> (1,1)
Get(1)
Key 1 found. Move (1,1) to front. Cache order: (1,1) -> (2,2). Return 1.
Put(3,3)
Cache full. Remove the least recently used entry (2,2). Add (3,3) to front. Cache order: (3,3) -> (1,1)
Get(2)
Key 2 not found. Return -1.
| Operation | Cache State (Most Recent to Least) | Returned Value |
|---|---|---|
| Put(1,1) | (1,1) | |
| Put(2,2) | (2,2) -> (1,1) | |
| Get(1) | (1,1) -> (2,2) | 1 |
| Put(3,3) | (3,3) -> (1,1) | |
| Get(2) | (3,3) -> (1,1) | -1 |
Why This Works
Step 1: Dictionary for Fast Lookup
Using a Dictionary lets us find if a key exists and get its node in constant time.
Step 2: LinkedList for Usage Order
A LinkedList keeps keys in order from most recently used at the front to least recently used at the end.
Step 3: Updating Usage on Access
When a key is accessed or added, we move its node to the front to mark it as recently used.
Step 4: Evicting Least Recently Used
If capacity is exceeded, we remove the node at the end of the list and delete its key from the dictionary.
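Steps 3 and 4 hinge on the fact that LinkedList<T> works with explicit LinkedListNode references, so a node can be unlinked and re-inserted at the front in O(1). A minimal standalone illustration of that idiom (separate from the cache class above):

```csharp
using System;
using System.Collections.Generic;

class MoveToFrontDemo
{
    static void Main()
    {
        var list = new LinkedList<int>();
        list.AddFirst(2);
        var node = list.AddLast(1);   // keep the node reference, as cacheMap does

        // Accessing key 1: unlink its node and re-insert at the front, both O(1).
        list.Remove(node);
        list.AddFirst(node);

        Console.WriteLine(string.Join(" -> ", list)); // 1 -> 2
    }
}
```

Because the dictionary stores these node references, the cache never has to scan the list to find the entry to move or evict.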
Alternative Approaches
```csharp
using System;
using System.Collections.Specialized;

public class LRUCache
{
    private readonly int capacity;
    private readonly OrderedDictionary cache;

    public LRUCache(int capacity)
    {
        this.capacity = capacity;
        cache = new OrderedDictionary();
    }

    public int Get(int key)
    {
        if (!cache.Contains(key)) return -1;
        // Cast the key to object: OrderedDictionary also has an int indexer
        // that looks up by position, which is not what we want here.
        var value = (int)cache[(object)key];
        // Re-insert at index 0 to mark the entry as most recently used.
        cache.Remove(key);
        cache.Insert(0, key, value);
        return value;
    }

    public void Put(int key, int value)
    {
        if (cache.Contains(key))
        {
            cache.Remove(key);
        }
        else if (cache.Count == capacity)
        {
            // The last index holds the least recently used entry.
            cache.RemoveAt(cache.Count - 1);
        }
        cache.Insert(0, key, value);
    }
}
```
This OrderedDictionary version is shorter, but Remove and Insert shift entries internally, so Get and Put run in O(n) rather than O(1). Complexity: O(n) time per operation, O(capacity) space.
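The comparison table below also lists a custom doubly linked list combined with a dictionary, i.e. the same idea as the main approach but with an explicit node class for clarity and control. A sketch of that variant (class and member names such as Node, Unlink, and AddFront are illustrative, not from the original):

```csharp
using System;
using System.Collections.Generic;

public class LruCacheManual
{
    private class Node
    {
        public int Key, Value;
        public Node Prev, Next;
    }

    private readonly int capacity;
    private readonly Dictionary<int, Node> map = new Dictionary<int, Node>();
    // Sentinel head/tail nodes avoid null checks when splicing.
    private readonly Node head = new Node();
    private readonly Node tail = new Node();

    public LruCacheManual(int capacity)
    {
        this.capacity = capacity;
        head.Next = tail;
        tail.Prev = head;
    }

    private void Unlink(Node node)
    {
        node.Prev.Next = node.Next;
        node.Next.Prev = node.Prev;
    }

    private void AddFront(Node node)
    {
        node.Next = head.Next;
        node.Prev = head;
        head.Next.Prev = node;
        head.Next = node;
    }

    public int Get(int key)
    {
        if (!map.TryGetValue(key, out var node)) return -1;
        Unlink(node);
        AddFront(node);           // mark as most recently used
        return node.Value;
    }

    public void Put(int key, int value)
    {
        if (map.TryGetValue(key, out var node))
        {
            node.Value = value;
            Unlink(node);
            AddFront(node);
            return;
        }
        if (map.Count == capacity)
        {
            var lru = tail.Prev;  // least recently used sits just before the tail sentinel
            Unlink(lru);
            map.Remove(lru.Key);
        }
        var fresh = new Node { Key = key, Value = value };
        AddFront(fresh);
        map[key] = fresh;
    }
}
```

Behavior matches the main approach; the only difference is that the node splicing done by LinkedList<T> is written out by hand.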
Time Complexity
Both Get and Put operations run in constant time because dictionary lookups and linked list node moves are O(1).
Space Complexity
The cache stores up to capacity items in the dictionary and linked list, so space is O(capacity).
Which Approach is Fastest?
The Dictionary + LinkedList combination is the fastest: every operation is O(1), whereas OrderedDictionary pays O(n) for its Remove and Insert calls. All three approaches store O(capacity) entries, so memory use is comparable.
| Approach | Time | Space | Best For |
|---|---|---|---|
| Dictionary + LinkedList | O(1) | O(capacity) | High performance, large caches |
| OrderedDictionary | O(n) for some ops | O(capacity) | Simpler code, small caches |
| Custom Doubly Linked List + Dictionary | O(1) | O(capacity) | Full control, educational purposes |