Queue Implementation Using Linked List in DSA Python - Time & Space Complexity
We want to understand how fast the main operations of a queue are when it is built on a linked list. Specifically, we want to see how the time to add or remove an item changes as the queue grows.
Analyze the time complexity of the following code snippet.
```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class Queue:
    def __init__(self):
        self.front = None
        self.rear = None

    def enqueue(self, data):
        new_node = Node(data)
        if self.rear is None:
            self.front = self.rear = new_node
            return
        self.rear.next = new_node
        self.rear = new_node

    def dequeue(self):
        if self.front is None:
            return None
        temp = self.front
        self.front = temp.next
        if self.front is None:
            self.rear = None
        return temp.data
```
This code adds items to the end and removes items from the front of a queue using a linked list.
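A quick usage sketch makes the first-in, first-out behavior concrete (the `Node` and `Queue` definitions are repeated so the snippet runs on its own):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class Queue:
    def __init__(self):
        self.front = None
        self.rear = None

    def enqueue(self, data):
        new_node = Node(data)
        if self.rear is None:           # empty queue: both ends point to the new node
            self.front = self.rear = new_node
            return
        self.rear.next = new_node       # link the new node after the current rear
        self.rear = new_node            # and move the rear pointer to it

    def dequeue(self):
        if self.front is None:          # empty queue
            return None
        temp = self.front
        self.front = temp.next          # move the front pointer forward
        if self.front is None:          # queue became empty
            self.rear = None
        return temp.data

q = Queue()
for item in ["a", "b", "c"]:
    q.enqueue(item)

print(q.dequeue())  # a  (first in, first out)
print(q.dequeue())  # b
print(q.dequeue())  # c
print(q.dequeue())  # None (queue is empty)
```

Notice that every call touches only the `front` or `rear` pointer; no call walks the list.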
Identify any loops, recursion, or traversals that repeat work.
- Primary operation: Adding or removing a single node by updating pointers.
- How many times: Each enqueue or dequeue runs once per operation, no loops inside.
Each enqueue or dequeue operation takes the same amount of time no matter how many items are in the queue.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 1 per enqueue or dequeue |
| 100 | 1 per enqueue or dequeue |
| 1000 | 1 per enqueue or dequeue |
Pattern observation: The time stays constant regardless of queue size.
Time Complexity: O(1)
This means adding or removing an item takes the same short time no matter how big the queue is.
[X] Wrong: "Removing an item from the queue takes longer as the queue grows because it has to move through all nodes."
[OK] Correct: The queue keeps track of the front node, so removing just updates a pointer without searching through nodes.
Knowing that queue operations run in constant time shows you understand efficient data handling, a key skill in many coding challenges.
"What if we did not keep a rear pointer and had to find the end each time we enqueue? How would the time complexity change?"