Caching helps your FastAPI app respond faster by saving data it has already fetched or computed. That means less waiting for users and less work for your server.
Caching strategies in FastAPI
Introduction
Caching is a good fit in situations like these:
When your API returns data that doesn't change often, like a list of countries.
When you want to reduce the load on your database by reusing recent query results.
When you want to speed up responses for popular requests that many users make.
When you want to temporarily store computed results to avoid repeating heavy calculations.
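The last point above is the core mechanic: compute a result once, then reuse it while it is still fresh. Here is a minimal stdlib-only sketch of that idea (the `ttl_cache` helper is hypothetical, written for illustration; it is not part of FastAPI or fastapi-cache):

```python
import time
from functools import wraps

def ttl_cache(expire: float):
    """Cache a function's results for `expire` seconds (illustrative helper)."""
    def decorator(fn):
        store = {}  # maps arguments -> (timestamp, value)

        @wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            entry = store.get(args)
            if entry is not None and now - entry[0] < expire:
                return entry[1]  # cache hit: skip the computation entirely
            value = fn(*args)
            store[args] = (now, value)  # cache miss: compute and remember
            return value
        return wrapper
    return decorator

calls = 0

@ttl_cache(expire=60)
def heavy(n: int) -> int:
    global calls
    calls += 1  # count how often the real work runs
    return n * n

heavy(4)
heavy(4)  # served from the cache; the function body runs only once
print(calls)  # 1
```

Real caching libraries add eviction, async support, and shared storage on top of this, but the hit/miss logic is the same.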
Syntax
```python
from fastapi import FastAPI
from fastapi_cache import FastAPICache
from fastapi_cache.backends.inmemory import InMemoryBackend
from fastapi_cache.decorator import cache

app = FastAPI()

@app.on_event("startup")
async def startup():
    FastAPICache.init(InMemoryBackend())

@app.get("/items/{item_id}")
@cache(expire=60)
async def read_item(item_id: int):
    # pretend this is a slow lookup
    return {"item_id": item_id, "value": "some data"}
```
The @cache decorator stores the endpoint's response and serves it from the cache for a set time (in seconds).
You must initialize the cache backend (like in-memory or Redis) when the app starts.
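Under the hood, a cache backend is essentially a key-value store with per-entry expiry, and the init step just makes one shared instance available to every decorated endpoint. A rough stdlib-only model of that role (the class and method names here are illustrative, not the real fastapi-cache API):

```python
import time

class ToyInMemoryBackend:
    """Toy key-value store with per-key expiry, modeling what a backend provides."""
    def __init__(self):
        self._store = {}  # key -> (deadline, value)

    def set(self, key, value, expire):
        # remember the value together with the moment it stops being valid
        self._store[key] = (time.monotonic() + expire, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None or time.monotonic() >= entry[0]:
            self._store.pop(key, None)  # missing or expired: drop it
            return None
        return entry[1]

# The "init at startup" step amounts to creating one shared backend instance.
backend = ToyInMemoryBackend()
backend.set("items:1", {"item_id": 1}, expire=60)
print(backend.get("items:1"))  # {'item_id': 1}
```

Swapping in Redis changes where the key-value pairs live (a separate server, shared by all workers) but not this get/set-with-expiry contract.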
Examples
This caches the function result for 2 minutes.
```python
from fastapi_cache.decorator import cache

@cache(expire=120)
async def get_data():
    return {"data": "cached for 2 minutes"}
```
Setting expire=0 disables caching for this endpoint.
```python
@cache(expire=0)
async def get_fresh_data():
    return {"data": "no caching"}
```
This example shows how to use Redis as a cache backend instead of in-memory.
```python
import aioredis  # note: aioredis has been merged into redis-py as redis.asyncio
from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend

@app.on_event("startup")
async def startup():
    redis = aioredis.from_url("redis://localhost")
    FastAPICache.init(RedisBackend(redis))
```
Sample Program
This FastAPI app has one endpoint that waits 2 seconds before returning data. But thanks to caching, if you call it again within 10 seconds, it returns instantly.
```python
from fastapi import FastAPI
from fastapi_cache import FastAPICache
from fastapi_cache.backends.inmemory import InMemoryBackend
from fastapi_cache.decorator import cache
import asyncio

app = FastAPI()

@app.on_event("startup")
async def startup():
    FastAPICache.init(InMemoryBackend())

@app.get("/slow-data/{item_id}")
@cache(expire=10)
async def slow_data(item_id: int):
    await asyncio.sleep(2)  # simulate a slow operation
    return {"item_id": item_id, "info": "This data is cached for 10 seconds"}
```
Output
The first request to /slow-data/1 takes about 2 seconds; repeated requests within the next 10 seconds return the same JSON body almost instantly.
Important Notes
In-memory cache is simple but resets when the app restarts.
For production, use Redis or another persistent cache backend.
Set cache expiration wisely to balance freshness and speed.
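That freshness/speed trade-off is easy to see in isolation: a short TTL re-runs the work sooner but serves fresher data. A small stdlib sketch (the helper and the TTL values are arbitrary, chosen only to keep the demo fast):

```python
import time

store = {}

def get_with_ttl(key, compute, expire):
    """Return the cached value while fresh, otherwise recompute (illustrative)."""
    entry = store.get(key)
    now = time.monotonic()
    if entry and now - entry[0] < expire:
        return entry[1], "hit"
    value = compute()
    store[key] = (now, value)
    return value, "miss"

_, first = get_with_ttl("report", lambda: "data", expire=0.05)
_, second = get_with_ttl("report", lambda: "data", expire=0.05)
time.sleep(0.06)  # wait until the entry has expired
_, third = get_with_ttl("report", lambda: "data", expire=0.05)
print(first, second, third)  # miss hit miss
```

With a longer TTL the third call would still be a hit, trading staleness for speed; picking `expire` is choosing where on that line you want to sit.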
Summary
Caching saves repeated work and speeds up your FastAPI app.
Use the @cache decorator with an expiration time to enable caching.
Choose the right cache backend for your needs, like in-memory for testing or Redis for production.