Why container services matter on AWS - Performance Analysis
We want to understand how the time to run container services on AWS grows as we add more containers or tasks.
How does the number of containers affect the work AWS does behind the scenes?
Let's analyze the time complexity of starting multiple containers with AWS ECS.
```javascript
// Start n tasks sequentially on an ECS cluster: one runTask API call per task.
for (let i = 0; i < n; i++) {
  ecs.runTask({
    cluster: 'myCluster',
    taskDefinition: 'myTaskDef',
    count: 1
  });
}
```
This sequence runs n container tasks one by one on an ECS cluster.
Look at what repeats as we add more containers.
- Primary operation: The API call to start a container task (ecs.runTask).
- How many times: Exactly n times, once per container.
Each new container adds one more API call to start it.
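You can verify this call-counting behavior without AWS credentials. The sketch below uses a mock client (`mockEcs` is an assumption for illustration, not the real SDK object) whose `runTask` only tallies invocations, making the one-call-per-container pattern explicit:

```javascript
// Minimal sketch: count API calls made by the sequential-start loop.
// mockEcs stands in for a real ECS client; its runTask only tallies calls.
function startContainers(n, ecs) {
  for (let i = 0; i < n; i++) {
    ecs.runTask({ cluster: 'myCluster', taskDefinition: 'myTaskDef', count: 1 });
  }
}

const mockEcs = { calls: 0, runTask() { this.calls++; } };
startContainers(1000, mockEcs);
console.log(mockEcs.calls); // 1000 — one call per container
```

The tally matches the table below: 10 containers mean 10 calls, 1000 containers mean 1000 calls.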
| Input Size (n) | Approx. API Calls/Operations |
|---|---|
| 10 | 10 calls |
| 100 | 100 calls |
| 1000 | 1000 calls |
Pattern observation: The number of operations grows directly with the number of containers.
Time Complexity: O(n)
This means startup time grows linearly: doubling the number of containers roughly doubles the work AWS performs.
[X] Wrong: "Starting multiple containers happens all at once, so time stays the same no matter how many containers."
[OK] Correct: Each container requires a separate API call and setup, so more containers mean more work and more time.
Understanding how container start time grows helps you design scalable systems and explain your choices clearly in interviews.
"What if we used a batch API to start multiple containers at once? How would the time complexity change?"
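One way to reason about that follow-up: ECS's `runTask` accepts a `count` of up to 10 tasks per call, so batching cuts the number of API calls to ceil(n/10). A sketch with the same mock-client idea (the helper name and mock are assumptions for illustration):

```javascript
// Sketch: batched starts, up to 10 tasks per runTask call (ECS's per-call limit).
// mockEcs is a stand-in client that only counts calls.
function startContainersBatched(n, ecs, batch = 10) {
  for (let remaining = n; remaining > 0; remaining -= batch) {
    ecs.runTask({
      cluster: 'myCluster',
      taskDefinition: 'myTaskDef',
      count: Math.min(batch, remaining) // last call may start fewer than 10
    });
  }
}

const mockEcs = { calls: 0, runTask() { this.calls++; } };
startContainersBatched(1000, mockEcs);
console.log(mockEcs.calls); // 100 — ceil(1000 / 10) calls instead of 1000
```

Note the nuance for interviews: batching shrinks the constant factor (10x fewer API calls here), but the call count is still ceil(n/10) = O(n), and AWS must still launch n tasks behind the scenes, so the overall time complexity remains linear.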