Queue Using Two Stacks in C (DSA) - Time & Space Complexity
We want to understand how fast operations run when we build a queue using two stacks.
Specifically, how does the time to add or remove items change as the queue grows?
Analyze the time complexity of the following code snippet.
```c
// Push element x to the back of the queue
void enqueue(Stack* s1, int x) {
    push(s1, x);
}

// Remove the element at the front of the queue
// (assumes the queue is non-empty)
int dequeue(Stack* s1, Stack* s2) {
    if (isEmpty(s2)) {
        while (!isEmpty(s1)) {
            push(s2, pop(s1));
        }
    }
    return pop(s2);
}
```
This code implements a queue with two stacks. Enqueue pushes onto the first stack. Dequeue pops from the second stack, but first transfers every element over from the first stack whenever the second stack is empty; the transfer reverses the elements' order, so the oldest element ends up on top.
Identify the repeated work: loops, recursion, or traversals.
- Primary operation: transferring all elements from the first stack to the second during a dequeue that finds the second stack empty.
- How many times: only when the second stack runs empty, not on every dequeue call.
When the second stack is empty, dequeue moves all n pending elements at once, which takes n steps. But those n elements took n enqueues to accumulate, and once transferred, each one can be popped in a single step. Over its lifetime, every element is pushed and popped at most twice (once per stack), so the expensive transfer is paid for by the cheap operations around it.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 moves spread over dequeues |
| 100 | About 100 moves spread over dequeues |
| 1000 | About 1000 moves spread over dequeues |
Pattern observation: Although some dequeues take longer, on average each operation takes constant time because moves are spread out.
Time Complexity: O(1) amortized
This means each enqueue or dequeue takes constant time on average, even if some dequeues take longer occasionally.
[X] Wrong: "Every dequeue takes O(n) time because it moves all elements."
[OK] Correct: The moving happens only when the second stack is empty, so the cost is spread out over many dequeues, making average time constant.
Understanding amortized time helps you explain why some operations seem slow but are efficient overall, a key skill in coding interviews.
What if we used three stacks instead of two? How would the time complexity change?
