Enqueue Using Linked List in DSA Python - Time & Space Complexity
We want to understand how the time to add an item to a queue using a linked list changes as the queue grows.
How does the number of steps needed to add one item change when the queue gets bigger?
Analyze the time complexity of the following code snippet.
```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class Queue:
    def __init__(self):
        self.front = None
        self.rear = None

    def enqueue(self, data):
        new_node = Node(data)
        if self.rear is None:           # empty queue: new node is both front and rear
            self.front = self.rear = new_node
            return
        self.rear.next = new_node       # link the new node after the current rear
        self.rear = new_node            # advance the rear pointer to the new node
```
This code adds a new item to the end of a queue implemented with a linked list.
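For example, enqueueing a few items and then walking the list from the front confirms FIFO order (the classes are repeated here so the snippet runs on its own):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class Queue:
    def __init__(self):
        self.front = None
        self.rear = None

    def enqueue(self, data):
        new_node = Node(data)
        if self.rear is None:
            self.front = self.rear = new_node
            return
        self.rear.next = new_node
        self.rear = new_node

q = Queue()
for item in [1, 2, 3]:
    q.enqueue(item)

# Walk from front to rear to see the stored order.
node, items = q.front, []
while node is not None:
    items.append(node.data)
    node = node.next
print(items)  # [1, 2, 3]
```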
Identify any loops, recursion, or array traversals that repeat.
- Primary operation: Adding a new node by updating pointers.
- How many times: This happens once per enqueue call; there are no loops or traversals.
Adding one item always takes the same number of steps, no matter how big the queue is.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 5-6 steps |
| 100 | 5-6 steps |
| 1000 | 5-6 steps |
Pattern observation: The time to add one item stays about the same regardless of queue size.
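One rough way to check this pattern empirically is to count the basic operations each enqueue performs. The step counter below is instrumentation added for illustration, not part of the original class:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class CountingQueue:
    def __init__(self):
        self.front = None
        self.rear = None
        self.steps = 0  # instrumentation: operations performed by the last enqueue

    def enqueue(self, data):
        self.steps = 0
        new_node = Node(data)
        self.steps += 1  # node creation
        if self.rear is None:
            self.front = self.rear = new_node
            self.steps += 2  # two pointer updates
            return
        self.rear.next = new_node
        self.rear = new_node
        self.steps += 2      # two pointer updates

for n in (10, 100, 1000):
    q = CountingQueue()
    for i in range(n):
        q.enqueue(i)
    # The final enqueue costs the same number of steps at every size.
    print(n, q.steps)
```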
Time Complexity: O(1)
This means adding an item takes a constant amount of time no matter how many items are already in the queue. Each enqueue also allocates exactly one node, so the extra space per operation is O(1) as well.
[X] Wrong: "Adding an item takes longer as the queue grows because we have to find the end each time."
[OK] Correct: Because we keep a pointer to the end (rear), we add the new item directly without searching, so time stays constant.
Knowing that enqueue on a linked-list queue runs in constant time shows you understand efficient data structure design, a key skill in interviews.
"What if we did not keep a rear pointer and had to find the end each time? How would the time complexity change?"
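One way to explore that question: a hypothetical SlowQueue (a name introduced here for illustration) that keeps only a front pointer must walk to the last node on every enqueue. Each call then traverses all n existing nodes, so enqueue degrades from O(1) to O(n):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class SlowQueue:
    """Hypothetical queue with no rear pointer, for comparison only."""
    def __init__(self):
        self.front = None

    def enqueue(self, data):
        new_node = Node(data)
        if self.front is None:
            self.front = new_node
            return
        node = self.front
        while node.next is not None:  # O(n) walk to find the current end
            node = node.next
        node.next = new_node

q = SlowQueue()
for item in [1, 2, 3]:
    q.enqueue(item)

node, items = q.front, []
while node is not None:
    items.append(node.data)
    node = node.next
print(items)  # [1, 2, 3] -- same result, but each enqueue cost grew with queue size
```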