How to Configure Caching in Nginx for Faster Web Performance
To configure caching in nginx, use the proxy_cache_path directive to define cache storage and proxy_cache inside a server or location block to enable caching. This setup stores responses on disk and serves them quickly for repeated requests, reducing backend load and improving speed.
Syntax
The main directives to configure caching in nginx are:
- proxy_cache_path: Defines where and how cached files are stored.
- proxy_cache: Enables caching for a specific location or server block.
- proxy_cache_key: Defines the key used to store cache entries (usually the request URL).
- proxy_cache_valid: Sets how long cached responses are considered valid.
- proxy_cache_use_stale: Allows serving stale content if the backend is down.
```nginx
# proxy_cache_path must be declared in the http {} context, outside any server block
proxy_cache_path /path/to/cache levels=1:2 keys_zone=my_cache:10m max_size=1g inactive=60m use_temp_path=off;

server {
    location / {
        proxy_cache my_cache;
        proxy_cache_key "$scheme$request_method$host$request_uri";
        proxy_cache_valid 200 302 10m;
        proxy_cache_valid 404 1m;
        proxy_cache_use_stale error timeout updating;
        proxy_pass http://backend_server;
    }
}
```
Example
This example shows a simple caching setup where responses from http://backend_server are cached on disk for 10 minutes for successful (200) responses. It improves response time by serving cached content for repeated requests.
```nginx
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m max_size=500m inactive=30m use_temp_path=off;

server {
    listen 80;
    server_name example.com;

    location / {
        proxy_cache my_cache;
        proxy_cache_key "$scheme$request_method$host$request_uri";
        proxy_cache_valid 200 10m;
        proxy_cache_valid 404 1m;
        proxy_cache_use_stale error timeout updating;
        proxy_pass http://backend_server;
    }
}
```
Output
Nginx starts and caches responses from backend_server in /var/cache/nginx.
Repeated requests to example.com serve cached content for 10 minutes.
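To verify this behavior, you can expose nginx's built-in $upstream_cache_status variable in a response header. The header name X-Cache-Status below is a common convention, not a built-in; a minimal sketch:

```nginx
location / {
    proxy_cache my_cache;
    proxy_cache_valid 200 10m;
    # $upstream_cache_status reports MISS, HIT, EXPIRED, STALE, UPDATING, or BYPASS
    add_header X-Cache-Status $upstream_cache_status;
    proxy_pass http://backend_server;
}
```

Requesting the page twice (for example with curl -I http://example.com/) should show X-Cache-Status: MISS on the first request and HIT on a repeat within the 10-minute window.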
Common Pitfalls
Common mistakes when configuring caching in nginx include:
- Not setting proxy_cache_path correctly, causing cache storage errors.
- Using an insufficient keys_zone size, which limits cache metadata storage (one megabyte holds roughly 8,000 keys).
- Forgetting to set proxy_cache_valid, so responses without caching headers from the backend are never cached.
- Not handling stale content with proxy_cache_use_stale, leading to errors when the backend is down.
- An incorrect proxy_cache_key causing unnecessary cache misses.
Example of a wrong and right way:
```nginx
# Wrong: proxy_cache references a zone that was never defined with proxy_cache_path
server {
    location / {
        proxy_cache my_cache;
        proxy_pass http://backend_server;
    }
}

# Right: define proxy_cache_path first, then enable caching
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m max_size=500m inactive=30m use_temp_path=off;

server {
    location / {
        proxy_cache my_cache;
        proxy_cache_valid 200 10m;
        proxy_pass http://backend_server;
    }
}
```

Quick Reference
| Directive | Purpose | Example Value |
|---|---|---|
| proxy_cache_path | Defines cache storage location and parameters | /var/cache/nginx levels=1:2 keys_zone=my_cache:10m max_size=1g inactive=60m |
| proxy_cache | Enables caching using a named cache zone | my_cache |
| proxy_cache_key | Sets the key to identify cached content | "$scheme$request_method$host$request_uri" |
| proxy_cache_valid | Sets cache duration for response codes | 200 302 10m; 404 1m |
| proxy_cache_use_stale | Serve stale content on backend errors | error timeout updating |
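Building on proxy_cache_use_stale from the table above, newer nginx versions (1.11.10+) can refresh expired entries in the background while still serving the stale copy. A sketch of this pattern:

```nginx
location / {
    proxy_cache my_cache;
    proxy_cache_valid 200 10m;
    # serve a stale entry while one request refreshes it in the background
    proxy_cache_use_stale error timeout updating;
    proxy_cache_background_update on;
    # collapse concurrent requests for the same uncached key into one upstream fetch
    proxy_cache_lock on;
    proxy_pass http://backend_server;
}
```

This keeps response times flat when popular entries expire, at the cost of briefly serving slightly outdated content.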
Key Takeaways
- Define cache storage with proxy_cache_path before enabling caching.
- Use proxy_cache inside server/location blocks to activate caching.
- Set proxy_cache_valid to control how long responses stay cached.
- Use proxy_cache_key to avoid cache misses by uniquely identifying requests.
- Enable proxy_cache_use_stale to serve cached content during backend failures.