Insert at End of Circular Linked List in DSA Python - Time & Space Complexity
We want to understand how the time to insert a new node at the end of a circular linked list changes as the list grows.
How does the number of steps needed grow when the list gets bigger?
Analyze the time complexity of the following code snippet.
```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

def insert_end(head, data):
    new_node = Node(data)
    if not head:                  # empty list: the single node points to itself
        new_node.next = new_node
        return new_node
    temp = head
    while temp.next != head:      # traverse until temp is the last node
        temp = temp.next
    temp.next = new_node          # link the old tail to the new node
    new_node.next = head          # close the circle back to the head
    return head
```
This code inserts a new node at the end of a circular linked list by traversing the list to find the last node.
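A minimal usage sketch (repeating the `Node` and `insert_end` definitions above so it runs on its own) builds a small circular list and walks the circle once to confirm the insertion order:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

def insert_end(head, data):
    new_node = Node(data)
    if not head:                  # empty list: node points to itself
        new_node.next = new_node
        return new_node
    temp = head
    while temp.next != head:      # walk to the last node
        temp = temp.next
    temp.next = new_node          # old tail links to the new node
    new_node.next = head          # close the circle back to the head
    return head

head = None
for value in [1, 2, 3]:
    head = insert_end(head, value)

# Collect the values by walking the circle exactly once.
values = [head.data]
node = head.next
while node != head:
    values.append(node.data)
    node = node.next
print(values)  # [1, 2, 3]
```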
Identify the operations that repeat: loops, recursion, or traversals.
- Primary operation: The while loop that moves through each node to find the last node.
- How many times: Once per node until it reaches the last one, so roughly n − 1 iterations for a list of n nodes.
As the list gets longer, the loop runs more times because it checks each node to find the end.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 steps to find the end |
| 100 | About 100 steps to find the end |
| 1000 | About 1000 steps to find the end |
Pattern observation: The number of steps grows directly with the number of nodes in the list.
Time Complexity: O(n)
This means the time to insert at the end grows linearly with the size of the list.
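We can confirm the linear pattern empirically. The sketch below uses a hypothetical instrumented variant, `insert_end_counted`, that counts loop iterations; for a list of n nodes, one end-insert takes n − 1 traversal steps:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

def insert_end_counted(head, data):
    """Same algorithm as insert_end, but also returns the traversal step count."""
    steps = 0
    new_node = Node(data)
    if not head:
        new_node.next = new_node
        return new_node, steps
    temp = head
    while temp.next != head:
        temp = temp.next
        steps += 1                # one step per node visited
    temp.next = new_node
    new_node.next = head
    return head, steps

for n in [10, 100, 1000]:
    head = None
    for i in range(n):
        head, _ = insert_end_counted(head, i)
    # One more insert into a list that already holds n nodes:
    head, steps = insert_end_counted(head, n)
    print(n, steps)               # steps grows in lockstep with n
```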
[X] Wrong: "Inserting at the end is always constant time because we just add a node."
[OK] Correct: We must first find the last node by walking through the list, which takes time proportional to the list size.
Understanding this helps you explain how linked list operations scale and shows you can analyze real data structure behaviors clearly.
"What if we kept a pointer to the last node? How would the time complexity change?"