Python Program · Beginner · 2 min read

Python Program to Implement LRU Cache

You can implement an LRU cache in Python with the @functools.lru_cache decorator: applying @lru_cache(maxsize=3) to a function caches its results, keeps at most 3 entries, and evicts the least recently used entry when the cache is full.
📋

Examples

Input: fib(3)
Output: 2

Input: fib(5)
Output: 5

Input: fib(10)
Output: 55
🧠

How to Think About It

To implement an LRU cache, you want to store results of expensive function calls so that if the same inputs come again, you return the stored result quickly. The cache should keep only a limited number of recent results and remove the least recently used ones when full. Python's functools module provides a ready-made decorator called lru_cache that handles this automatically.
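Before reaching for the decorator, the core idea can be sketched with a plain dict (a minimal illustration; the names memo and square are hypothetical):

```python
# A minimal sketch of memoization with a plain dict (no eviction yet):
# store results keyed by argument, return the stored value on repeat calls.
memo = {}

def square(n):
    if n in memo:       # cache hit: return the stored result
        return memo[n]
    result = n * n      # cache miss: compute...
    memo[n] = result    # ...and remember it for next time
    return result

print(square(4))  # computed: 16
print(square(4))  # served from memo: 16
```

Unlike lru_cache, this dict never evicts anything; an LRU cache additionally tracks recency and drops the least recently used entry once it reaches its size limit.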
📐

Algorithm

1. Define the function you want to cache results for.
2. Apply the @lru_cache decorator with a max size limit.
3. When the function is called, check if the result is in the cache.
4. If cached, return the stored result immediately.
5. If not cached, compute the result, store it, and return it.
6. When the cache is full, remove the least recently used entry.
💻

Code

```python
from functools import lru_cache

@lru_cache(maxsize=3)
def fib(n):
    if n < 2:
        return n
    return fib(n-1) + fib(n-2)

print(fib(3))  # Output: 2
print(fib(5))  # Output: 5
print(fib(10)) # Output: 55
```

Output

```
2
5
55
```
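The decorated function also exposes introspection helpers. A small sketch using cache_info() and cache_clear() (exact hit/miss counts depend on call order, so only the structural fields are shown):

```python
from functools import lru_cache

@lru_cache(maxsize=3)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

fib(3); fib(5); fib(10)

info = fib.cache_info()          # named tuple: hits, misses, maxsize, currsize
print(info.maxsize)              # 3
print(info.currsize)             # 3 -- never exceeds maxsize
print(info.hits > 0)             # True -- repeated subcalls were served from cache

fib.cache_clear()                # reset the cache entirely
print(fib.cache_info().currsize) # 0
```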
🔍

Dry Run

Let's trace fib(3) through the code with an LRU cache of max size 3.

1. Call fib(3): the cache is empty, so fib(3) = fib(2) + fib(1) must be computed.
2. Compute fib(2): cache miss, so fib(2) = fib(1) + fib(0) must be computed.
3. Compute fib(1) and fib(0): fib(1) = 1 and fib(0) = 0; both are stored in the cache.
4. Return fib(2): fib(2) = 1 + 0 = 1, stored in the cache.
5. Return fib(3): fib(1) is now a cache hit, so fib(3) = fib(2) + fib(1) = 1 + 1 = 2, stored in the cache.

| Call | Cache Contents (LRU → MRU) |
| --- | --- |
| fib(1) | {fib(1): 1} |
| fib(0) | {fib(1): 1, fib(0): 0} |
| fib(2) | {fib(1): 1, fib(0): 0, fib(2): 1} |
| fib(3) | {fib(2): 1, fib(1): 1, fib(3): 2} |

When fib(3) is inserted, the cache is already full. The cache hit on fib(1) during the computation of fib(3) makes fib(0) the least recently used entry, so fib(0) is the one evicted.
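The eviction can be checked empirically with fib.cache_info() (a sketch; the counts assume a fresh cache, maxsize=3, and Python's left-to-right evaluation of fib(n-1) + fib(n-2)):

```python
from functools import lru_cache

@lru_cache(maxsize=3)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

fib(3)
print(fib.cache_info())  # misses=4 (fib(3), fib(2), fib(1), fib(0)), hits=1 (fib(1))

fib(0)                   # fib(0) was evicted, so this is another miss
print(fib.cache_info().misses)  # 5

fib(3)                   # fib(3) is still cached (most recently used)
print(fib.cache_info().hits)    # 2
```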
💡

Why This Works

Step 1: Using the @lru_cache decorator

The @lru_cache decorator automatically caches function results and manages the cache size.

Step 2: Cache stores recent calls

It keeps only the most recent maxsize calls and removes the least recently used when full.

Step 3: Faster repeated calls

When the function is called again with the same argument, it returns the cached result instantly without recomputing.
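The saving is easy to measure by counting how many times the function body actually runs (a sketch; calls is an illustrative counter, and maxsize=None is used so nothing is evicted):

```python
from functools import lru_cache

calls = 0  # counts real executions of the body, not cached returns

@lru_cache(maxsize=None)  # unbounded cache: every result is kept
def fib(n):
    global calls
    calls += 1
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040
print(calls)    # 31 -- each n in 0..30 is computed exactly once
```

Without caching, fib(30) would execute the body over a million times; with caching, each distinct argument is computed once.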

🔄

Alternative Approaches

Manual LRU Cache with OrderedDict
```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.cache = OrderedDict()
        self.capacity = capacity

    def get(self, key):
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)  # mark as most recently used
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put(1, 1)
cache.put(2, 2)
print(cache.get(1))  # 1
cache.put(3, 3)      # evicts key 2
print(cache.get(2))  # -1 (evicted)
```
This approach gives full control over cache behavior but requires more code and manual management.
Using functools.cache (Python 3.9+)
```python
from functools import cache

@cache
def fib(n):
    if n < 2:
        return n
    return fib(n-1) + fib(n-2)

print(fib(10))  # 55
```

This caches all calls without a size limit, which can cause high memory use for large inputs.

Complexity: O(1) average time per cache hit; a cache miss costs the underlying computation (for fib, memoization reduces the work to O(n) recursive calls overall); O(maxsize) space for stored results.

Time Complexity

Cache hits return results in O(1) time. Cache misses fall through to the recursive computation, which is expensive on its own but greatly reduced once subresults are cached.

Space Complexity

The cache stores up to maxsize results, so space grows linearly with cache size.
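The bound is directly observable: currsize never exceeds maxsize no matter how many distinct arguments are cached (a small sketch; square is an illustrative function):

```python
from functools import lru_cache

@lru_cache(maxsize=3)
def square(n):
    return n * n

for i in range(100):  # 100 distinct arguments, but only 3 cache slots
    square(i)

print(square.cache_info().currsize)  # 3
```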

Which Approach is Fastest?

Using @lru_cache is fastest and simplest for most cases. Manual cache offers control but more code. Unlimited cache uses more memory.

| Approach | Time | Space | Best For |
| --- | --- | --- | --- |
| functools.lru_cache | O(1) cache hit | O(maxsize) | Simple, efficient caching |
| Manual OrderedDict | O(1) cache hit | O(capacity) | Custom cache behavior |
| functools.cache | O(1) cache hit | O(n) | Unlimited caching; simple but memory-heavy |
💡
Use @lru_cache decorator to add caching easily without extra code.
⚠️
Passing maxsize=None (or using functools.cache) lets the cache grow without bound and can use too much memory; note that lru_cache's default maxsize is 128, not unlimited.