Doubly Linked List Structure and Node Design in DSA Python - Time & Space Complexity
We want to understand how the time to create and link nodes in a doubly linked list grows as we add more nodes.
How does the work change when the list gets bigger?
Analyze the time complexity of the following code snippet.
```python
class Node:
    def __init__(self, data):
        self.data = data
        self.prev = None   # link to the previous node
        self.next = None   # link to the next node

class DoublyLinkedList:
    def __init__(self):
        self.head = None

    def append(self, data):
        new_node = Node(data)
        if not self.head:          # empty list: new node becomes the head
            self.head = new_node
            return
        current = self.head
        while current.next:        # walk to the last node
            current = current.next
        current.next = new_node    # link the new node at the end
        new_node.prev = current
```
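A quick usage sketch of the class above (the definitions are repeated so the snippet runs on its own; the forward/backward traversal loops are illustrative, not part of the class):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.prev = None
        self.next = None

class DoublyLinkedList:
    def __init__(self):
        self.head = None

    def append(self, data):
        new_node = Node(data)
        if not self.head:
            self.head = new_node
            return
        current = self.head
        while current.next:
            current = current.next
        current.next = new_node
        new_node.prev = current

# Append three values, then walk forward and backward.
dll = DoublyLinkedList()
for value in [10, 20, 30]:
    dll.append(value)

forward = []
current = dll.head
while current:
    forward.append(current.data)
    last = current          # remember the final node
    current = current.next
print(forward)   # [10, 20, 30]

# The prev links let us walk back from the last node.
backward = []
current = last
while current:
    backward.append(current.data)
    current = current.prev
print(backward)  # [30, 20, 10]
```

The backward walk is the payoff of the `prev` pointers: a singly linked list could not do this without re-traversing from the head.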
This code creates nodes and adds them to the end of a doubly linked list.
Identify the repeated work: the loops, recursion, or traversals that scale with the input size.
- Primary operation: The while loop that moves through the list to find the last node.
- How many times: roughly once for each node already in the list, so appending to a list of n nodes takes about n steps.
As the list grows, the time to add a new node grows because we must walk through all existing nodes to reach the end.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 steps to find the end |
| 100 | About 100 steps to find the end |
| 1000 | About 1000 steps to find the end |
Pattern observation: The work grows directly with the number of nodes already in the list.
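One way to check this pattern concretely is to count the nodes visited per append. This instrumented sketch (the `CountingList` name and step-counting return value are added here for illustration) uses the same append logic as the code above:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.prev = None
        self.next = None

class CountingList:
    """Same append logic as before, but returns how many nodes were visited."""
    def __init__(self):
        self.head = None

    def append(self, data):
        new_node = Node(data)
        if not self.head:
            self.head = new_node
            return 0            # empty list: no traversal needed
        steps = 1               # visiting the head counts as one step
        current = self.head
        while current.next:
            current = current.next
            steps += 1
        current.next = new_node
        new_node.prev = current
        return steps

lst = CountingList()
for i in range(1001):
    last_steps = lst.append(i)

# The 1001st append had to visit all 1000 existing nodes first.
print(last_steps)  # 1000
```

The step count matches the table: each append visits every node already in the list, which is exactly the linear growth that O(n) describes.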
Time Complexity: O(n)
This means adding a node takes longer as the list gets bigger, because we must walk through all nodes to find the end.
[X] Wrong: "Adding a node is always fast and takes the same time no matter how big the list is."
[OK] Correct: Without a pointer to the last node, every append must walk the entire list to find the end, so the operation slows down as the list grows.
Understanding how linked list operations scale helps you explain your code choices clearly and shows you know how data structures behave in real situations.
"What if we kept a pointer to the last node in the list? How would the time complexity of adding a node change?"
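One possible answer, sketched under the assumption that we add a `tail` attribute (the class and attribute names here are illustrative):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.prev = None
        self.next = None

class DoublyLinkedListWithTail:
    def __init__(self):
        self.head = None
        self.tail = None  # always points at the last node

    def append(self, data):
        # No traversal needed: link the new node directly after the tail.
        new_node = Node(data)
        if not self.tail:
            self.head = new_node
            self.tail = new_node
            return
        new_node.prev = self.tail
        self.tail.next = new_node
        self.tail = new_node

dll = DoublyLinkedListWithTail()
for value in [10, 20, 30]:
    dll.append(value)
print(dll.tail.data)       # 30
print(dll.tail.prev.data)  # 20
```

With a tail pointer, append performs a constant number of pointer updates no matter how long the list is, so its time complexity drops from O(n) to O(1).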