C# Program · Intermediate · 3 min read

C# Program to Implement LRU Cache Efficiently

You can implement an LRU cache in C# by combining a Dictionary for fast lookups and a LinkedList to track usage order, updating the list on each access and removing the least recently used item when capacity is exceeded.
📋

Examples

Input: Put(1, 1), Put(2, 2), Get(1), Put(3, 3), Get(2) (capacity 2)
Output: 1, -1

Input: Put(1, 10), Put(2, 20), Put(3, 30), Get(1), Put(4, 40), Get(3) (capacity 3)
Output: 10, 30

Input: Get(5)
Output: -1
🧠

How to Think About It

To build an LRU cache, use a dictionary to store key-value pairs for quick access and a linked list to keep track of the order in which keys are used. When a key is accessed or added, move it to the front of the list to mark it as recently used. If the cache exceeds its capacity, remove the key at the end of the list, which is the least recently used.
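Before looking at the full cache, here is a minimal sketch of just this pairing: a `Dictionary` that maps keys to `LinkedListNode`s, and a `LinkedList` whose front holds the most recently used key. The class name `Sketch` is illustrative; the `Remove`/`AddFirst` node overloads are what make the "move to front" step O(1).

```csharp
using System;
using System.Collections.Generic;

class Sketch {
    static void Main() {
        var order = new LinkedList<int>();              // front = most recently used
        var map = new Dictionary<int, LinkedListNode<int>>();

        foreach (var key in new[] { 1, 2, 3 })
            map[key] = order.AddFirst(key);             // order: 3 -> 2 -> 1

        var node = map[1];                              // O(1) lookup of the node
        order.Remove(node);                             // O(1) unlink
        order.AddFirst(node);                           // mark key 1 as most recent
        Console.WriteLine(string.Join(" -> ", order));  // 1 -> 3 -> 2
    }
}
```

The Code section below wires exactly this pairing into Get and Put.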
📐

Algorithm

1. Initialize a dictionary for key-node pairs and a linked list to track usage order.
2. For a Get operation, check if the key exists; if yes, move its node to the front and return its value; otherwise, return -1.
3. For a Put operation, if the key exists, update its value and move its node to the front.
4. If the key does not exist, add it to the dictionary and to the front of the linked list.
5. If adding a new key exceeds capacity, remove the last node from the linked list and delete its key from the dictionary.
💻

Code

csharp
using System;
using System.Collections.Generic;

public class LRUCache {
    private int capacity;
    private Dictionary<int, LinkedListNode<(int key, int value)>> cacheMap;
    private LinkedList<(int key, int value)> lruList;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        cacheMap = new Dictionary<int, LinkedListNode<(int key, int value)>>();
        lruList = new LinkedList<(int key, int value)>();
    }

    public int Get(int key) {
        // TryGetValue avoids a second dictionary lookup after ContainsKey.
        if (!cacheMap.TryGetValue(key, out var node)) return -1;
        // Move the node to the front to mark it as most recently used.
        lruList.Remove(node);
        lruList.AddFirst(node);
        return node.Value.value;
    }

    public void Put(int key, int value) {
        if (cacheMap.TryGetValue(key, out var existing)) {
            // Key already present: unlink the old node; it is re-added
            // below at the front with the new value.
            lruList.Remove(existing);
        } else if (cacheMap.Count == capacity) {
            // Cache full: evict the least recently used entry (list tail).
            var lastNode = lruList.Last;
            cacheMap.Remove(lastNode.Value.key);
            lruList.RemoveLast();
        }
        lruList.AddFirst((key, value));
        cacheMap[key] = lruList.First;
    }
}

class Program {
    static void Main() {
        var cache = new LRUCache(2);
        cache.Put(1, 1);
        cache.Put(2, 2);
        Console.WriteLine(cache.Get(1)); // 1
        cache.Put(3, 3);
        Console.WriteLine(cache.Get(2)); // -1
    }
}
Output
1
-1
🔍

Dry Run

Let's trace Put(1,1), Put(2,2), Get(1), Put(3,3), Get(2) through the code with a capacity of 2.

1. Put(1,1): Cache is empty. Add (1,1) to the front. Cache: (1,1)
2. Put(2,2): Add (2,2) to the front. Cache order: (2,2) -> (1,1)
3. Get(1): Key 1 found. Move (1,1) to the front. Cache order: (1,1) -> (2,2). Return 1.
4. Put(3,3): Cache is full. Evict the least recently used entry (2,2). Add (3,3) to the front. Cache order: (3,3) -> (1,1)
5. Get(2): Key 2 not found. Return -1.

| Operation | Cache State (Most Recent to Least) | Returned Value |
|-----------|------------------------------------|----------------|
| Put(1,1)  | (1,1)                              |                |
| Put(2,2)  | (2,2) -> (1,1)                     |                |
| Get(1)    | (1,1) -> (2,2)                     | 1              |
| Put(3,3)  | (3,3) -> (1,1)                     |                |
| Get(2)    | (3,3) -> (1,1)                     | -1             |
💡

Why This Works

Step 1: Dictionary for Fast Lookup

Using a Dictionary lets us find if a key exists and get its node in constant time.

Step 2: LinkedList for Usage Order

A LinkedList keeps keys in order from most recently used at the front to least recently used at the end.

Step 3: Updating Usage on Access

When a key is accessed or added, we move its node to the front to mark it as recently used.

Step 4: Evicting Least Recently Used

If capacity is exceeded, we remove the node at the end of the list and delete its key from the dictionary.

🔄

Alternative Approaches

Using OrderedDictionary
csharp
using System.Collections.Specialized;
using System;

public class LRUCache {
    private int capacity;
    private OrderedDictionary cache;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        cache = new OrderedDictionary();
    }

    public int Get(int key) {
        if (!cache.Contains(key)) return -1;
        var value = (int)cache[key];
        cache.Remove(key);
        cache.Insert(0, key, value);
        return value;
    }

    public void Put(int key, int value) {
        if (cache.Contains(key)) {
            cache.Remove(key);
        } else if (cache.Count == capacity) {
            cache.RemoveAt(cache.Count - 1);
        }
        cache.Insert(0, key, value);
    }
}
Simpler code, but OrderedDictionary is non-generic (keys and values are boxed as object) and its Remove and Insert operations are O(n).
Using Custom Doubly Linked List and Dictionary
csharp
/* Similar to main approach but with explicit node class for clarity and control. */
More code but clearer control over nodes and pointers.
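The placeholder above can be fleshed out roughly as follows. This is a hedged sketch, not the article's original code: the names Node, AddToFront, and Unlink are illustrative, and the sentinel head/tail nodes are one common way to avoid null checks at the list ends.

```csharp
using System;
using System.Collections.Generic;

public class LRUCacheCustom {
    // Explicit node class gives direct control over prev/next pointers.
    private class Node {
        public int Key, Value;
        public Node Prev, Next;
    }

    private readonly int capacity;
    private readonly Dictionary<int, Node> map = new Dictionary<int, Node>();
    private readonly Node head = new Node();   // sentinel: head.Next is most recent
    private readonly Node tail = new Node();   // sentinel: tail.Prev is least recent

    public LRUCacheCustom(int capacity) {
        this.capacity = capacity;
        head.Next = tail;
        tail.Prev = head;
    }

    // Detach a node from the list in O(1).
    private void Unlink(Node n) {
        n.Prev.Next = n.Next;
        n.Next.Prev = n.Prev;
    }

    // Insert a node right after the head sentinel (most recent position).
    private void AddToFront(Node n) {
        n.Next = head.Next;
        n.Prev = head;
        head.Next.Prev = n;
        head.Next = n;
    }

    public int Get(int key) {
        if (!map.TryGetValue(key, out var n)) return -1;
        Unlink(n);
        AddToFront(n);
        return n.Value;
    }

    public void Put(int key, int value) {
        if (map.TryGetValue(key, out var existing)) {
            existing.Value = value;            // update in place, refresh recency
            Unlink(existing);
            AddToFront(existing);
            return;
        }
        if (map.Count == capacity) {
            var lru = tail.Prev;               // evict least recently used
            Unlink(lru);
            map.Remove(lru.Key);
        }
        var node = new Node { Key = key, Value = value };
        map[key] = node;
        AddToFront(node);
    }
}
```

The behavior matches the main approach; the trade-off is more pointer bookkeeping in exchange for not depending on LinkedList<T>.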

Complexity: O(1) time, O(capacity) space

Time Complexity

Both Get and Put operations run in constant time because dictionary lookups and linked list node moves are O(1).

Space Complexity

The cache stores up to capacity items in the dictionary and linked list, so space is O(capacity).

Which Approach is Fastest?

The dictionary-plus-linked-list combination is the fastest choice: every operation stays O(1), whereas OrderedDictionary degrades to O(n) for removals and inserts.

| Approach | Time | Space | Best For |
|----------|------|-------|----------|
| Dictionary + LinkedList | O(1) | O(capacity) | High performance, large caches |
| OrderedDictionary | O(n) for some ops | O(capacity) | Simpler code, small caches |
| Custom Doubly Linked List + Dictionary | O(1) | O(capacity) | Full control, educational purposes |
💡
Use a dictionary plus a linked list to get O(1) time for both get and put operations in an LRU cache.
⚠️
A common pitfall is forgetting to move accessed nodes to the front, which silently breaks the LRU eviction order.