
Java Program to Implement LRU Cache Efficiently

You can implement an LRU cache in Java by extending LinkedHashMap with access order enabled (the three-argument constructor with accessOrder = true) and overriding removeEldestEntry so that the least recently used entry is evicted once the size exceeds the capacity. The full implementation is shown in the Code section below.
📋

Examples

Input: LRUCache cache = new LRUCache(2); cache.put(1, 1); cache.put(2, 2); cache.get(1); cache.put(3, 3); cache.get(2);
Output: 1 -1

Input: LRUCache cache = new LRUCache(1); cache.put(1, 10); cache.put(2, 20); cache.get(1); cache.get(2);
Output: -1 20

Input: LRUCache cache = new LRUCache(3); cache.put(1, 100); cache.put(2, 200); cache.put(3, 300); cache.get(2); cache.put(4, 400); cache.get(1);
Output: 200 -1
🧠

How to Think About It

To build an LRU Cache, think of a container that remembers the order of usage. When you add or get an item, it becomes the most recently used. If the cache is full, remove the least recently used item. Using a data structure that keeps order and allows quick access helps, like Java's LinkedHashMap with access order enabled.
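To see what "access order" means in practice, here is a minimal sketch (class name is illustrative) using a plain LinkedHashMap constructed with accessOrder = true:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    public static void main(String[] args) {
        // Third constructor argument true = iterate entries in access order, not insertion order
        Map<Integer, String> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put(1, "a");
        map.put(2, "b");
        map.put(3, "c");
        map.get(1); // touching key 1 makes it the most recently used entry
        System.out.println(map.keySet()); // [2, 3, 1] -- key 1 is now last (most recent)
    }
}
```

Iteration runs from least to most recently used, which is exactly why the eldest entry is the right one to evict.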
📐

Algorithm

1. Create a class extending LinkedHashMap with access order set to true.
2. Set a fixed capacity for the cache.
3. Override removeEldestEntry to remove the oldest entry when the size exceeds the capacity.
4. On put or get, the accessed entry automatically moves to the most recent position.
5. For get, return the value if the key is present, else return -1.
💻

Code

java
import java.util.LinkedHashMap;
import java.util.Map;

class LRUCache extends LinkedHashMap<Integer, Integer> {
    private final int capacity;

    public LRUCache(int capacity) {
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
        return size() > capacity;
    }

    public int get(int key) {
        return super.getOrDefault(key, -1);
    }

    public void put(int key, int value) {
        super.put(key, value);
    }

    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2);
        cache.put(1, 1);
        cache.put(2, 2);
        System.out.println(cache.get(1)); // returns 1
        cache.put(3, 3);                  // evicts key 2
        System.out.println(cache.get(2)); // returns -1 (not found)
    }
}
Output
1 -1
🔍

Dry Run

Let's trace the example where cache capacity is 2 and we put keys 1 and 2, then get key 1, put key 3, and get key 2.

1. Put key 1 → Cache: {1=1}
2. Put key 2 → Cache: {1=1, 2=2}
3. Get key 1 → returns 1 and moves key 1 to the most recent position; order: {2=2, 1=1}
4. Put key 3 → cache is full, so the least recently used key 2 is evicted; Cache: {1=1, 3=3}
5. Get key 2 → key 2 not found, returns -1

Operation  | Cache Content  | Returned Value
put(1,1)   | {1=1}          | N/A
put(2,2)   | {1=1, 2=2}     | N/A
get(1)     | {2=2, 1=1}     | 1
put(3,3)   | {1=1, 3=3}     | N/A
get(2)     | {1=1, 3=3}     | -1
💡

Why This Works

Step 1: Extending LinkedHashMap

We extend LinkedHashMap with access order enabled so it keeps track of usage order automatically.

Step 2: Capacity Control

By overriding removeEldestEntry, we remove the oldest entry when the cache size exceeds the set capacity.

Step 3: Get and Put Operations

The get method returns the value or -1 if not found, and put adds or updates entries, moving them to the most recent position.
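Note that with accessOrder = true, updating an existing key via put also counts as an access, so the entry's recency is refreshed. A quick sketch (class name is illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class RefreshOnPutDemo {
    public static void main(String[] args) {
        Map<Integer, Integer> cache = new LinkedHashMap<>(16, 0.75f, true);
        cache.put(1, 1);
        cache.put(2, 2);
        cache.put(1, 10); // updating key 1 also refreshes its recency
        System.out.println(cache.keySet()); // [2, 1] -- key 1 is most recent again
    }
}
```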

🔄

Alternative Approaches

Using HashMap and Doubly Linked List
java
import java.util.HashMap;

class LRUCache {
    class Node {
        int key, value;
        Node prev, next;
        Node(int k, int v) { key = k; value = v; }
    }

    private final int capacity;
    private HashMap<Integer, Node> map;
    private Node head, tail;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        map = new HashMap<>();
        head = new Node(0, 0);
        tail = new Node(0, 0);
        head.next = tail;
        tail.prev = head;
    }

    private void remove(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void addToFront(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }

    public int get(int key) {
        if (!map.containsKey(key)) return -1;
        Node node = map.get(key);
        remove(node);
        addToFront(node);
        return node.value;
    }

    public void put(int key, int value) {
        if (map.containsKey(key)) {
            remove(map.get(key));
        }
        Node node = new Node(key, value);
        addToFront(node);
        map.put(key, node);
        if (map.size() > capacity) {
            Node lru = tail.prev;
            remove(lru);
            map.remove(lru.key);
        }
    }
}
This approach uses a HashMap for fast lookup and a doubly linked list to track usage order manually, offering O(1) operations but more code complexity.
Using LinkedHashMap without subclassing
java
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache {
    private LinkedHashMap<Integer, Integer> cache;
    private int capacity;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        this.cache = new LinkedHashMap<Integer, Integer>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                return size() > LRUCache.this.capacity;
            }
        };
    }

    public int get(int key) {
        return cache.getOrDefault(key, -1);
    }

    public void put(int key, int value) {
        cache.put(key, value);
    }
}
This uses an anonymous subclass of LinkedHashMap inside the LRUCache class, keeping code concise but less reusable.

Complexity: O(1) time, O(capacity) space

Time Complexity

Both get and put run in expected O(1) time: LinkedHashMap provides constant-time hash lookup, and maintaining access order is just an O(1) relinking of the entry in its internal doubly linked list.

Space Complexity

The cache stores up to capacity entries, so space grows linearly with capacity.

Which Approach is Fastest?

Using LinkedHashMap is simpler and efficient for most cases; the HashMap + doubly linked list approach is more complex but also O(1) and useful if you want full control.

Approach                          | Time  | Space        | Best For
LinkedHashMap subclass            | O(1)  | O(capacity)  | Simple and fast implementation
HashMap + doubly linked list      | O(1)  | O(capacity)  | Full control, manual order management
LinkedHashMap anonymous subclass  | O(1)  | O(capacity)  | Concise code, less reusable
💡
Use LinkedHashMap with accessOrder=true and override removeEldestEntry for a simple LRU cache.
⚠️
Forgetting to set accessOrder=true makes LinkedHashMap keep insertion order instead of usage order, so eviction becomes FIFO rather than LRU.
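To illustrate the pitfall, here is a sketch (names and the capacity of 2 are chosen for the demo) comparing eviction with and without access order:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderPitfall {
    // Builds a size-2 cache; accessOrder decides LRU vs. insertion-order eviction
    static Map<Integer, Integer> makeCache(boolean accessOrder) {
        return new LinkedHashMap<Integer, Integer>(16, 0.75f, accessOrder) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                return size() > 2; // fixed capacity of 2 for this demo
            }
        };
    }

    public static void main(String[] args) {
        Map<Integer, Integer> lru = makeCache(true);
        Map<Integer, Integer> fifo = makeCache(false);
        for (Map<Integer, Integer> c : java.util.List.of(lru, fifo)) {
            c.put(1, 1);
            c.put(2, 2);
            c.get(1);    // should mark key 1 as recently used
            c.put(3, 3); // triggers eviction of the eldest entry
        }
        System.out.println(lru.keySet());  // [1, 3] -- LRU correctly evicted key 2
        System.out.println(fifo.keySet()); // [2, 3] -- insertion order evicted key 1 instead
    }
}
```

With accessOrder = false, the get(1) call has no effect on ordering, so the cache wrongly evicts the key that was just used.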