
FastCGI cache in Nginx - Commands & Configuration

Introduction
Web servers often need to speed up responses from dynamic applications. FastCGI cache stores those responses on disk so the server can reuse them without running the application again. Typical situations where it helps:
When your site runs PHP or another FastCGI application and you want to reduce server load.
When you want faster page loads by serving cached content to visitors.
When you want to stop the backend from regenerating the same content repeatedly.
When you want to handle more users on the same hardware by caching responses.
When you want to control how long cached content stays fresh before it is refreshed.
Config File - nginx.conf
nginx.conf
http {
    fastcgi_cache_path /var/cache/nginx/fastcgi_cache levels=1:2 keys_zone=MYCACHE:10m inactive=60m;
    fastcgi_cache_key "$scheme$request_method$host$request_uri";

    server {
        listen 80;
        server_name example.com;

        location ~ \.php$ {
            include fastcgi_params;
            fastcgi_pass unix:/run/php/php8.1-fpm.sock;

            fastcgi_cache MYCACHE;
            fastcgi_cache_valid 200 302 10m;
            fastcgi_cache_valid 404 1m;
            fastcgi_cache_use_stale error timeout invalid_header http_500;
            fastcgi_cache_min_uses 1;
            add_header X-Cache $upstream_cache_status;
        }
    }
}

fastcgi_cache_path sets the on-disk cache directory and its layout (levels=1:2), names a shared-memory zone for cache keys (keys_zone=MYCACHE:10m), and evicts entries not accessed within 60 minutes (inactive=60m).

fastcgi_cache_key defines the string that uniquely identifies each request; requests that expand to the same key share one cache entry.
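To see where a given request ends up on disk, note that nginx names each cache file after the MD5 hash of the expanded cache key, and levels=1:2 turns the last hex character and the two characters before it into subdirectories. A minimal sketch, assuming the key format and cache path from the config above:

```shell
# The key below is the expansion of "$scheme$request_method$host$request_uri"
# for an HTTP GET of http://example.com/index.php (illustrative values).
KEY="httpGETexample.com/index.php"
HASH=$(printf '%s' "$KEY" | md5sum | awk '{print $1}')
LAST1=$(printf '%s' "$HASH" | cut -c32)      # levels first part (1 char)
NEXT2=$(printf '%s' "$HASH" | cut -c30-31)   # levels second part (2 chars)
echo "/var/cache/nginx/fastcgi_cache/$LAST1/$NEXT2/$HASH"
```

This is handy for confirming that a specific page was cached, or for deleting a single stale entry by hand.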

Inside server, the location ~ \.php$ block configures FastCGI caching for PHP files.

fastcgi_cache enables caching using the defined zone.

fastcgi_cache_valid sets how long responses with certain status codes are cached.

fastcgi_cache_use_stale allows serving stale cache if backend errors occur.

add_header X-Cache adds a header carrying $upstream_cache_status (e.g. MISS, HIT, EXPIRED, BYPASS, STALE) so each response shows whether it came from the cache.
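Some requests should always reach the backend, such as POST requests or pages for logged-in users. A common extension of the config above is a conditional bypass; in this sketch, fastcgi_cache_bypass and fastcgi_no_cache are real nginx directives, while the $skip_cache variable name and the cookie pattern are illustrative assumptions to adapt to your application:

```nginx
# Inside the server block: decide per request whether to skip the cache.
set $skip_cache 0;
if ($request_method = POST) { set $skip_cache 1; }
if ($http_cookie ~* "logged_in") { set $skip_cache 1; }  # cookie name is an assumption

location ~ \.php$ {
    # ...existing fastcgi_* directives from above...
    fastcgi_cache_bypass $skip_cache;   # fetch from the backend even if cached
    fastcgi_no_cache $skip_cache;       # do not store this response
}
```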

Commands
Check the nginx configuration file syntax to ensure no errors before reloading.
Terminal
sudo nginx -t
Expected Output
nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful
Reload nginx to apply the new FastCGI cache configuration without stopping the server.
Terminal
sudo systemctl reload nginx
Expected Output
No output (command runs silently)
Make a request to the PHP page and check the response headers to see if caching is working.
Terminal
curl -I http://example.com/index.php
Expected Output
HTTP/1.1 200 OK
Server: nginx
Date: Thu, 01 Jun 2023 12:00:00 GMT
Content-Type: text/html; charset=UTF-8
X-Cache: MISS
Make the same request again to verify the response is served from cache this time.
Terminal
curl -I http://example.com/index.php
Expected Output
HTTP/1.1 200 OK
Server: nginx
Date: Thu, 01 Jun 2023 12:00:05 GMT
Content-Type: text/html; charset=UTF-8
X-Cache: HIT
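The header check above can be scripted. A minimal sketch using stubbed header text in place of a live request; in practice, pipe `curl -sI http://example.com/index.php` into the same filter:

```shell
# Sample headers standing in for a live response (assumption: the server
# returns the X-Cache header configured above).
HEADERS='HTTP/1.1 200 OK
Server: nginx
Content-Type: text/html; charset=UTF-8
X-Cache: HIT'

# Extract the X-Cache value: MISS on the first request, HIT afterwards.
STATUS=$(printf '%s\n' "$HEADERS" | awk -F': ' '/^X-Cache:/ {print $2}')
echo "$STATUS"
```

If every request reports MISS, check that the key and zone name match the config, and note that nginx skips caching by default when the backend sends Set-Cookie or restrictive Cache-Control headers (fastcgi_ignore_headers can override this).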
Key Concept

If you remember nothing else from this pattern, remember: FastCGI cache stores dynamic responses on disk so nginx can serve them quickly without running the backend app again.

Common Mistakes
Not testing nginx configuration before reload
Reloading with syntax errors causes nginx to fail and stop serving requests.
Always run 'nginx -t' to check config syntax before reloading.
Not setting a unique fastcgi_cache_key
Cache may serve wrong content for different URLs or request methods.
Include scheme, method, host, and URI in fastcgi_cache_key to uniquely identify requests.
Not adding X-Cache header to verify caching
You cannot easily tell if responses come from cache or backend.
Add 'add_header X-Cache $upstream_cache_status;' to see cache status in response headers.
Summary
Configure fastcgi_cache_path and fastcgi_cache_key in nginx.conf to enable caching.
Use fastcgi_cache and fastcgi_cache_valid inside location blocks to control caching behavior.
Test nginx config syntax with 'nginx -t' before reloading to avoid downtime.
Verify caching works by checking the X-Cache header in HTTP responses.