Why load balancing matters in AWS - Performance Analysis
We want to understand how the work done by load balancers changes as more users or requests come in.
How does the number of requests affect the load balancer's operations?
Analyze the time complexity of the following operation sequence.
# Create a load balancer
aws elbv2 create-load-balancer --name my-load-balancer --subnets subnet-12345 subnet-67890
# Register targets (servers) to a target group
# (assumes the target group was created beforehand with create-target-group)
aws elbv2 register-targets --target-group-arn arn:aws:elasticloadbalancing:... --targets Id=i-1234567890abcdef0 Id=i-0abcdef1234567890
# Incoming requests are distributed by the load balancer
# Each request is routed to one of the registered targets
This sequence sets up a load balancer and registers servers so that incoming requests can be distributed evenly among them.
- Primary operation: Routing each incoming request to a target server.
- How many times: Once per incoming request.
As the number of requests grows, the load balancer routes each one individually.
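The per-request routing step can be sketched as a simple round-robin dispatcher. This is a minimal Python sketch, not AWS's actual routing logic; the target IDs are the illustrative instance IDs from the commands above.

```python
from itertools import cycle

# Hypothetical registered targets (instance IDs from the example commands)
targets = ["i-1234567890abcdef0", "i-0abcdef1234567890"]

def make_router(targets):
    """Return a round-robin router: each call routes exactly one request."""
    ring = cycle(targets)
    def route(request):
        # One routing decision per incoming request -> constant work per request
        return next(ring)
    return route

route = make_router(targets)
assignments = [route(f"request-{i}") for i in range(4)]
print(assignments)
# -> ['i-1234567890abcdef0', 'i-0abcdef1234567890',
#     'i-1234567890abcdef0', 'i-0abcdef1234567890']
```

Because each request triggers exactly one routing decision, n requests cost n decisions in total.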
| Input Size (n) | Approx. Routing Operations |
|---|---|
| 10 | 10 routing operations |
| 100 | 100 routing operations |
| 1000 | 1000 routing operations |
Pattern observation: The number of routing operations grows directly with the number of requests.
Time Complexity: O(n)
This means the work the load balancer does grows linearly, in direct proportion to the number of incoming requests.
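The linear pattern in the table above can be checked by counting routing operations directly. This is a Python sketch in which a counter stands in for the load balancer's per-request work:

```python
def count_routing_ops(num_requests, targets):
    """Count routing decisions made while distributing requests round-robin."""
    ops = 0
    for i in range(num_requests):
        _target = targets[i % len(targets)]  # one routing decision per request
        ops += 1
    return ops

targets = ["server-a", "server-b"]
for n in (10, 100, 1000):
    print(n, count_routing_ops(n, targets))
# -> 10 10
#    100 100
#    1000 1000
```

The operation count always equals n, which is exactly the O(n) pattern from the table.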
[X] Wrong: "The load balancer handles all requests at once, so time stays the same no matter how many requests come in."
[OK] Correct: Each request needs to be routed separately, so more requests mean more work.
Understanding how load balancers handle growing traffic helps you design systems that stay fast and reliable as more people use them.
"What if the load balancer had to check the health of each target before routing every request? How would the time complexity change?"