Deque (double-ended queue) in Data Structures Theory - Time & Space Complexity
We want to understand how the cost of deque operations changes as the deque grows.
Specifically, how does the time to add or remove an item scale with the number of items already stored?
Analyze the time complexity of the following deque operations.
```python
class Deque:
    def __init__(self):
        self.items = []

    def add_front(self, item):
        self.items.insert(0, item)   # shifts every existing item right: O(n)

    def add_rear(self, item):
        self.items.append(item)      # amortized O(1)

    def remove_front(self):
        return self.items.pop(0)     # shifts every remaining item left: O(n)

    def remove_rear(self):
        return self.items.pop()      # O(1)
```
This code shows basic deque operations: adding or removing items from both ends.
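The list methods behind those operations can be seen directly in a short, self-contained session (values here are illustrative):

```python
items = []
items.insert(0, 'b')   # like add_front on an empty deque
items.insert(0, 'a')   # add_front again: 'b' must shift one slot right
items.append('c')      # like add_rear: no shifting
print(items)           # ['a', 'b', 'c']
print(items.pop(0))    # remove_front: returns 'a', the rest shift left
print(items.pop())     # remove_rear: returns 'c'
print(items)           # ['b']
```

Note that `insert(0, ...)` and `pop(0)` are exactly the calls whose cost grows with the deque, while `append()` and `pop()` do not touch the other elements.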
Look at what happens when we add or remove items.
- Primary operation: Inserting or removing at the front requires shifting elements.
- How many times: Each front operation shifts every one of the n existing items by one position.
- Adding or removing at the rear touches only the last slot and shifts nothing.
Adding or removing at the rear stays fast no matter the size.
Adding or removing at the front takes longer as the deque grows, because every existing item must shift by one position.
| Input Size (n) | Approx. Operations for Front Insert/Remove |
|---|---|
| 10 | 10 shifts |
| 100 | 100 shifts |
| 1000 | 1000 shifts |
Pattern observation: Front operations take longer roughly proportional to the number of items.
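The table above can be reproduced with a small shift-counting helper (an illustrative function, not part of the original class) that performs a front insert by hand, the way `list.insert(0, value)` works internally:

```python
def insert_front_counting(items, value):
    """Insert value at the front of items by hand, returning how many
    existing elements had to shift one slot to the right."""
    items.append(None)            # make room for one more element
    shifts = 0
    for i in range(len(items) - 1, 0, -1):
        items[i] = items[i - 1]   # shift each element right by one
        shifts += 1
    items[0] = value
    return shifts

for n in (10, 100, 1000):
    data = list(range(n))
    print(n, insert_front_counting(data, -1))
# Prints: 10 10, 100 100, 1000 1000 — shifts grow in lockstep with n
```

The count returned equals the number of items already in the list, which is exactly the linear pattern the table shows.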
Time Complexity: O(n) for front insert/remove, O(1) for rear insert/remove (append is amortized O(1))
This means rear operations stay quick regardless of size, but front operations slow down in proportion to the number of items.
[X] Wrong: "All deque operations take the same constant time."
[OK] Correct: Using a list, front operations need to move many items, so they take longer as size grows.
Understanding how different deque operations scale helps you choose the right data structure and explain your choices clearly.
"What if we used a linked list instead of a list to implement the deque? How would the time complexity change?"