Doubly Linked Lists in Data Structures Theory - Time & Space Complexity
When working with doubly linked lists, it is important to understand how the time to perform operations changes as the list grows.
We want to know how fast we can add, remove, or find items as the list gets bigger.
Analyze the time complexity of the following code snippet.
```python
class Node:
    def __init__(self, data):
        self.data = data
        self.prev = None
        self.next = None

class DoublyLinkedList:
    def __init__(self):
        self.head = None

    def append(self, data):
        new_node = Node(data)
        if not self.head:
            self.head = new_node
            return
        # Walk from the head to find the last node
        current = self.head
        while current.next:
            current = current.next
        current.next = new_node
        new_node.prev = current
```
This code adds a new item to the end of a doubly linked list by walking through the list to find the last node.
Identify the repeated work: any loops, recursion, or list traversals.
- Primary operation: The while loop that moves through each node until the end.
- How many times: It runs once for each node in the list until the last one is found.
As the list gets longer, the time to find the end grows roughly in direct proportion to the number of items.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 steps to reach the end |
| 100 | About 100 steps to reach the end |
| 1000 | About 1000 steps to reach the end |
Pattern observation: The time grows linearly as the list size increases.
Time Complexity: O(n)
This means the time to add an item at the end grows directly with the number of items in the list.
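One way to confirm this linear growth is to count the loop iterations directly. The sketch below is a hypothetical instrumented variant of the `append` method above: it returns how many nodes were visited on each call, so you can see the step count track the list size.

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.prev = None
        self.next = None

class DoublyLinkedList:
    def __init__(self):
        self.head = None

    def append(self, data):
        """Append data and return the number of nodes traversed."""
        new_node = Node(data)
        if not self.head:
            self.head = new_node
            return 0  # empty list: no traversal needed
        steps = 1  # we start at the head
        current = self.head
        while current.next:
            current = current.next
            steps += 1
        current.next = new_node
        new_node.prev = current
        return steps

lst = DoublyLinkedList()
counts = [lst.append(i) for i in range(1000)]
print(counts[9], counts[99], counts[999])  # grows with list size
```

Appending the nth item visits all n-1 existing nodes, matching the table above: roughly n steps for a list of n items.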
[X] Wrong: "Adding to the end of a doubly linked list always takes constant time because we can go backwards."
[OK] Correct: Without keeping a reference to the last node, we must walk from the start to find the end, which takes time proportional to the list size.
Understanding how linked list operations scale helps you explain your choices clearly and shows you know how data structures behave in real situations.
"What if we kept a reference to the tail node? How would the time complexity of appending change?"