Nginx · DevOps · ~5 min read

Proxy cache key in Nginx - Commands & Configuration

Introduction
A caching proxy speeds up content delivery by saving copies of backend responses. The proxy cache key determines how nginx identifies each unique cached item, so repeated requests can be served from cache instead of being sent to the backend again.

Use this pattern:
When you want to speed up your website by caching responses from a backend server.
When you want to reduce load on your backend by serving repeated requests from cache.
When you want to control which parts of a request determine the cached content, such as ignoring cookies or query strings.
When you want to cache different versions of a page based on user language or device type.
When you want to troubleshoot or optimize cache hits by controlling the cache key.
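As an example of tailoring the key, here is a sketch of two alternative proxy_cache_key values (the values are illustrative, not part of the config below; adapt them to your site):

```nginx
# Ignore the query string: /page?a=1 and /page?a=2 share one
# cached copy. Only safe if query args never change the response.
proxy_cache_key "$scheme://$host$uri";

# Cache a separate copy per client language via the
# Accept-Language header. Illustrative only: real setups usually
# normalize the header first, since otherwise one copy is stored
# per distinct header value.
proxy_cache_key "$scheme://$host$request_uri$http_accept_language";
```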
Config File - nginx.conf
nginx.conf
http {
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m max_size=1g inactive=60m use_temp_path=off;

    server {
        listen 80;

        location / {
            proxy_cache my_cache;
            proxy_cache_key "$scheme://$host$request_uri";
            # Expose cache status so the curl checks below can verify caching.
            add_header X-Cache-Status $upstream_cache_status;
            proxy_pass http://backend_server;
        }
    }
}

proxy_cache_path defines where and how the cache is stored.

proxy_cache enables caching for the location.

proxy_cache_key sets the unique key for each cached response, here combining scheme, host, and full request URI.

add_header X-Cache-Status exposes whether a response came from the cache (HIT) or the backend (MISS), which the curl commands below use to confirm caching.

This setup caches responses separately for http/https, for different hosts, and for different request URIs.
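The location block proxies to backend_server, which nginx must be able to resolve. A minimal sketch of a matching upstream block inside the same http context (the address and port are assumptions; point them at your real application server):

```nginx
upstream backend_server {
    # Assumed local app server; replace with your backend's address.
    server 127.0.0.1:8080;
}
```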

Commands
Check the nginx configuration file for syntax errors before reloading.
Terminal
sudo nginx -t
Expected Output
nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful
Reload nginx to apply the new configuration without downtime.
Terminal
sudo systemctl reload nginx
Expected Output
No output (command runs silently)
Send a request to the nginx server to check response headers and confirm caching behavior.
Terminal
curl -I http://localhost/
Expected Output
HTTP/1.1 200 OK
Server: nginx
Date: Thu, 27 Jun 2024 12:00:00 GMT
Content-Type: text/html
Content-Length: 612
Last-Modified: Thu, 27 Jun 2024 11:50:00 GMT
Connection: keep-alive
X-Cache-Status: MISS
Send the same request again to verify the response is served from cache this time.
Terminal
curl -I http://localhost/
Expected Output
HTTP/1.1 200 OK
Server: nginx
Date: Thu, 27 Jun 2024 12:00:05 GMT
Content-Type: text/html
Content-Length: 612
Last-Modified: Thu, 27 Jun 2024 11:50:00 GMT
Connection: keep-alive
X-Cache-Status: HIT
Key Concept

If you remember nothing else from this pattern, remember: the proxy cache key defines how nginx identifies unique cached responses, so it controls cache hits and misses.
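One way to see the key at work on disk: nginx names each cache file after the MD5 hash of its cache key, and the levels=1:2 setting in proxy_cache_path splits that hash into subdirectories. A sketch of the mapping in shell (the key string is an example matching this config's proxy_cache_key for a request to /):

```shell
# Example cache key, as produced by "$scheme://$host$request_uri"
# for a request to http://localhost/.
key='http://localhost/'

# nginx names each cache file after the MD5 of its cache key.
hash=$(printf '%s' "$key" | md5sum | awk '{print $1}')

# levels=1:2 -> the last hex char is the first directory level,
# and the two chars before it are the second level.
l1=${hash: -1}
l2=${hash: -3:2}
echo "/var/cache/nginx/$l1/$l2/$hash"
```

After a request, you can check whether the corresponding file exists under /var/cache/nginx.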

Common Mistakes
Relying on the default key instead of setting proxy_cache_key explicitly.
The default key ($scheme$proxy_host$request_uri) always includes the full query string, so URLs that differ only in irrelevant query parameters each get their own cached copy, and the caching behavior stays implicit rather than documented in your config.
Define proxy_cache_key to include only the parts of the request that should differentiate cached content.
Including variables that change on every request, such as $request_id, in proxy_cache_key.
This makes every request's key unique, so nothing is ever served from cache.
Use stable variables such as $scheme, $host, and $request_uri for the cache key.
Not testing the nginx configuration with nginx -t before reloading.
With a broken config, the reload fails and your changes silently never take effect; a later restart with the same broken config would cause downtime.
Always run sudo nginx -t to verify the config before reloading.
Summary
Define proxy_cache_key in nginx.conf to control how cached responses are identified.
Test nginx configuration syntax with nginx -t before reloading to avoid errors.
Reload nginx to apply changes and verify caching behavior with repeated curl requests.