Which of the following best explains why rate limiting helps prevent abuse on a web server?
Think about how limiting requests helps keep the server stable.
Rate limiting controls how many requests a user can send in a given time. This stops users from sending too many requests and overwhelming the server, which prevents abuse like denial of service.
Given this nginx configuration snippet, what response does the client receive when a user exceeds the limit?

limit_req_zone $binary_remote_addr zone=mylimit:10m rate=5r/s;

server {
    location /api/ {
        limit_req zone=mylimit burst=3 nodelay;
    }
}

What HTTP status code indicates too many requests?
HTTP 429 (Too Many Requests) is the standard status code telling a client to slow down. Note that nginx rejects over-limit requests with 503 by default; it only returns 429 if the configuration sets limit_req_status 429.
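A sketch of a configuration that returns 429 rather than the default 503 (the zone name, size, and rates here are illustrative):

limit_req_zone $binary_remote_addr zone=mylimit:10m rate=5r/s;

server {
    location /api/ {
        limit_req zone=mylimit burst=3 nodelay;
        limit_req_status 429;    # reject over-limit requests with 429 instead of 503
    }
}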
Which nginx directive correctly sets a rate limit of 10 requests per second per IP address?
Consider the variable used to identify unique clients efficiently.
$binary_remote_addr is a compact binary representation of the client IP address, preferred over $remote_addr because each key consumes less of the shared-memory zone. rate=10r/s sets the limit to 10 requests per second.
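A sketch of the full directive in context (the zone name perip and 10m size are illustrative choices):

limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    location / {
        limit_req zone=perip;    # apply the per-IP limit to this location
    }
}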
A site using nginx rate limiting returns many HTTP 429 errors even though traffic is low. What is the most likely cause?
Think about how nginx identifies clients for rate limiting.
If the rate-limit key does not uniquely identify clients — for example, when nginx sits behind a proxy or load balancer, every request arrives from the proxy's IP, so $remote_addr (or $binary_remote_addr) is the same for everyone — many users share a single counter. The shared limit trips even at low overall traffic, producing unexpected 429 errors.
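One common remedy is the real_ip module, which restores the original client address from a header set by a trusted proxy, so $binary_remote_addr again identifies individual clients. A sketch, assuming the proxy at 10.0.0.1 (an illustrative address) sets X-Real-IP:

set_real_ip_from 10.0.0.1;    # only trust this proxy to supply the header
real_ip_header X-Real-IP;     # take the client IP from X-Real-IP

limit_req_zone $binary_remote_addr zone=perclient:10m rate=5r/s;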
You manage a public API with many users. You want to prevent abuse but allow short bursts of traffic. Which nginx configuration approach best fits this need?
Consider how bursts help handle sudden traffic spikes.
Setting a moderate sustained rate with a burst allowance lets users send a few extra requests in quick succession without being rejected, preserving user experience during short spikes while still capping sustained abuse.
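A sketch of such a configuration (the rate and burst values are illustrative): a sustained limit of 5 requests per second with room for 10 extra, where nodelay serves burst requests immediately instead of queueing them:

limit_req_zone $binary_remote_addr zone=api:10m rate=5r/s;

server {
    location /api/ {
        # allow up to 10 requests above the sustained rate before rejecting
        limit_req zone=api burst=10 nodelay;
    }
}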