Data Structures Theory · knowledge · ~5 mins

LRU cache design with hash map and doubly linked list in Data Structures Theory - Time & Space Complexity

Time Complexity: LRU cache design with hash map and doubly linked list
O(1)
Understanding Time Complexity

Analyzing time complexity helps us understand how fast the LRU cache operations run as the cache size grows.

We want to know how the time to get or put items changes when the cache holds more entries.

Scenario Under Consideration

Analyze the time complexity of the following LRU cache operations.

class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.cache = {}  # key to node
        self.capacity = capacity
        self.head = Node(0, 0)  # dummy head
        self.tail = Node(0, 0)  # dummy tail
        self.head.next = self.tail
        self.tail.prev = self.head

    def get(self, key):
        if key in self.cache:
            node = self.cache[key]
            self._remove(node)
            self._add(node)
            return node.value
        return -1

    def put(self, key, value):
        if key in self.cache:
            self._remove(self.cache[key])
        node = Node(key, value)
        self._add(node)
        self.cache[key] = node
        if len(self.cache) > self.capacity:
            # Evict the least recently used entry, which always
            # sits just after the dummy head.
            lru = self.head.next
            self._remove(lru)
            del self.cache[lru.key]

    def _remove(self, node):
        # Unlink node in O(1) by pointing its neighbors past it.
        prev, nxt = node.prev, node.next
        prev.next = nxt
        nxt.prev = prev

    def _add(self, node):
        # Insert node just before the dummy tail, i.e. at the
        # most-recently-used end of the list.
        prev, nxt = self.tail.prev, self.tail
        prev.next = node
        node.prev = prev
        node.next = nxt
        nxt.prev = node

This code pairs a hash map (O(1) average lookup of a node by key) with a doubly linked list that keeps nodes in usage order, so the least recently used entry always sits just after the dummy head and can be moved or evicted with a constant number of pointer updates.
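The same get/put semantics can be sanity-checked against Python's built-in collections.OrderedDict, which maintains its own doubly linked list internally. This is a sketch, not part of the lesson's solution; the class name LRUCacheOD is just a label invented here:

```python
from collections import OrderedDict

class LRUCacheOD:
    """Same O(1) get/put behavior, built on OrderedDict
    (its internal doubly linked list plays the role of ours)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)   # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used

cache = LRUCacheOD(2)
cache.put(1, 'a')
cache.put(2, 'b')
assert cache.get(1) == 'a'   # key 1 is now most recently used
cache.put(3, 'c')            # over capacity: key 2 is evicted
assert cache.get(2) == -1
assert cache.get(3) == 'c'
```

Production code often reaches for exactly this combination; the hand-rolled version above makes the pointer bookkeeping explicit, which is what interviewers usually want to see.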

Identify Repeating Operations

Look at the main operations that repeat when we get or put items.

  • Primary operation: Accessing the hash map to find nodes and updating the doubly linked list pointers.
  • How many times: Each get or put does a fixed number of these steps, regardless of cache size.

How Execution Grows With Input

Each operation touches only a few nodes and hash map entries, no matter how big the cache is.

Input size (cache capacity)    Approx. operations per get/put
10                             About 5-10 pointer and hash map steps
100                            Still about 5-10 steps
1000                           Still about 5-10 steps

Pattern observation: The number of steps stays roughly the same no matter how many items the cache holds.
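A rough empirical check of this pattern (a sketch only; absolute numbers are machine-dependent and per-call overhead dominates): the time for a hash map lookup stays flat as the table grows, which is what keeps the step count in the table above constant.

```python
import timeit

# Two hash maps of very different sizes.
small = {i: i for i in range(10)}
big = {i: i for i in range(100_000)}

# Average time per lookup is roughly the same for both: the hash map
# jumps straight to the right bucket instead of scanning its entries.
t_small = timeit.timeit(lambda: small[5], number=200_000)
t_big = timeit.timeit(lambda: big[54_321], number=200_000)

print(f"200k lookups in a 10-key dict:   {t_small:.4f}s")
print(f"200k lookups in a 100k-key dict: {t_big:.4f}s")
```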

Final Time Complexity

Time Complexity: O(1)

This means each get or put operation takes the same short time, no matter how many items are stored.

Common Mistake

[X] Wrong: "Since the cache stores many items, get or put must take longer as it grows."

[OK] Correct: The hash map finds any node in constant time on average, and the doubly linked list unlinks and re-appends that node with a fixed number of pointer updates, so get and put stay fast at any cache size.
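To see why combining the two structures matters, here is a sketch of an LRU cache backed only by a Python list (ListLRUCache is a name made up for this comparison): both get and put must scan the list, so each operation degrades to O(n).

```python
class ListLRUCache:
    """Naive LRU cache using only a list: get/put scan it, O(n)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []  # (key, value) pairs, most recent at the end

    def get(self, key):
        for i, (k, v) in enumerate(self.items):       # O(n) scan
            if k == key:
                self.items.append(self.items.pop(i))  # O(n) move to end
                return v
        return -1

    def put(self, key, value):
        for i, (k, _) in enumerate(self.items):       # O(n) scan
            if k == key:
                self.items.pop(i)
                break
        self.items.append((key, value))
        if len(self.items) > self.capacity:
            self.items.pop(0)   # O(n) eviction from the front

cache = ListLRUCache(2)
cache.put(1, 'a')
cache.put(2, 'b')
cache.put(3, 'c')            # over capacity: key 1 is evicted
assert cache.get(1) == -1
assert cache.get(3) == 'c'
```

The hash map removes the scans, and the doubly linked list removes the O(n) shifts, which is exactly the division of labor in the design above.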

Interview Connect

Understanding this design shows you can combine data structures to keep operations efficient, a key skill in real-world coding.

Self-Check

"What if we replaced the doubly linked list with a singly linked list? How would the time complexity change?"