Min Stack Design in DSA Python - Time & Space Complexity
We want to understand how fast the Min Stack operations run as the number of elements grows: specifically, how the time to push, pop, or get the minimum changes as more items are stored.
Analyze the time complexity of the following Min Stack implementation.
```python
class MinStack:
    def __init__(self):
        self.stack = []      # all elements
        self.min_stack = []  # running minimums; top is always the current minimum

    def push(self, val):
        self.stack.append(val)
        # Push onto min_stack only when val is a new minimum.
        # Using <= (not <) keeps duplicates of the minimum, so pop stays correct.
        if not self.min_stack or val <= self.min_stack[-1]:
            self.min_stack.append(val)

    def pop(self):
        if self.stack:
            val = self.stack.pop()
            # If the popped value was the current minimum, retire it too.
            if val == self.min_stack[-1]:
                self.min_stack.pop()
            return val

    def get_min(self):
        return self.min_stack[-1] if self.min_stack else None
```
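A quick usage sketch shows the invariant in action; the class is restated here so the snippet runs on its own:

```python
class MinStack:
    def __init__(self):
        self.stack = []
        self.min_stack = []

    def push(self, val):
        self.stack.append(val)
        if not self.min_stack or val <= self.min_stack[-1]:
            self.min_stack.append(val)

    def pop(self):
        if self.stack:
            val = self.stack.pop()
            if val == self.min_stack[-1]:
                self.min_stack.pop()
            return val

    def get_min(self):
        return self.min_stack[-1] if self.min_stack else None

s = MinStack()
s.push(5)
s.push(3)
s.push(7)
print(s.get_min())  # 3  (7 never entered min_stack, since 7 > 3)
s.pop()             # removes 7; the minimum is unchanged
print(s.get_min())  # 3
s.pop()             # removes 3; min_stack's top falls back to 5
print(s.get_min())  # 5
```

Notice that `min_stack` only ever holds the values that were, at some point, the smallest seen so far: here `[5, 3]`, never `7`.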
This code keeps track of the minimum value at each step using a second stack.
Look at the main operations that repeat when the stack changes.
- Primary operation: Adding or removing elements from the main and min stacks.
- How many times: Each push or pop runs once per call, no loops inside.
Each operation does a fixed number of steps regardless of stack size.
| Operations Performed (n) | Approx. Total Steps |
|---|---|
| 10 | 10 push or pop steps |
| 100 | 100 push or pop steps |
| 1000 | 1000 push or pop steps |
Pattern observation: The time per operation stays the same no matter how big the stack is.
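You can check this pattern empirically with a small step-counting sketch (hypothetical instrumentation, not part of the original class): each `push` performs at most two list appends no matter how large the stack gets.

```python
# Instrumented copy of the push logic that counts list operations per call.
class CountingMinStack:
    def __init__(self):
        self.stack = []
        self.min_stack = []
        self.steps = 0  # total list operations performed across all pushes

    def push(self, val):
        self.steps += 1              # one append to the main stack
        self.stack.append(val)
        if not self.min_stack or val <= self.min_stack[-1]:
            self.steps += 1          # at most one append to the min stack
            self.min_stack.append(val)

for n in (10, 100, 1000):
    s = CountingMinStack()
    for i in range(n):
        s.push(i)
    # steps / n stays bounded by 2: constant work per operation
    print(n, s.steps, s.steps / n)
```

Total steps grow linearly with the number of operations, which is exactly what "O(1) per operation" means.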
Time Complexity: O(1) per operation
This means each push, pop, or get_min takes the same short time no matter how many items are in the stack.
Space Complexity: O(n)
The two stacks together hold at most 2n references, since min_stack never contains more entries than the main stack.
[X] Wrong: "Finding the minimum must check all elements, so it takes longer as the stack grows."
[OK] Correct: The min stack records each new minimum as elements are pushed, so get_min just reads the top of min_stack in constant time.
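To see what the wrong assumption would cost, here is a hedged sketch of the naive design it describes (a hypothetical `NaiveMinStack`, not from the code above), where get_min really does scan every element:

```python
# Naive design: no auxiliary stack, so get_min must scan the whole list.
class NaiveMinStack:
    def __init__(self):
        self.stack = []

    def push(self, val):
        self.stack.append(val)   # O(1)

    def pop(self):
        if self.stack:
            return self.stack.pop()  # O(1)

    def get_min(self):
        # min() walks every element: O(n) per call, growing with stack size.
        return min(self.stack) if self.stack else None
```

Both designs return the same answers; the difference is that the auxiliary-stack version pays a little extra memory on push so that get_min never has to scan.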
Knowing how to keep track of minimums efficiently shows you can design smart data structures that save time.
This skill helps you solve problems where quick access to special values is needed.
"What if we stored minimums only when they change, not every time? How would that affect time complexity?"
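One hedged way to explore that question: store `(value, count)` pairs in the min stack, so a repeated minimum bumps a counter instead of pushing a new entry. This hypothetical variant (not part of the original code) keeps every operation O(1) while shrinking the min stack when the same minimum occurs many times:

```python
# Variant: min_stack holds [value, count] pairs, one entry per distinct minimum.
class CompactMinStack:
    def __init__(self):
        self.stack = []
        self.min_stack = []  # entries are [value, count]

    def push(self, val):
        self.stack.append(val)
        if not self.min_stack or val < self.min_stack[-1][0]:
            self.min_stack.append([val, 1])      # a strictly new minimum
        elif val == self.min_stack[-1][0]:
            self.min_stack[-1][1] += 1           # repeat of the current minimum

    def pop(self):
        if not self.stack:
            return None
        val = self.stack.pop()
        if val == self.min_stack[-1][0]:
            self.min_stack[-1][1] -= 1
            if self.min_stack[-1][1] == 0:
                self.min_stack.pop()             # that minimum is fully gone
        return val

    def get_min(self):
        return self.min_stack[-1][0] if self.min_stack else None
```

Time complexity is unchanged at O(1) per operation; what improves is space in the duplicate-heavy case, since n equal minimums now cost one pair instead of n entries.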