Get Length of Linked List in DSA Python - Time & Space Complexity
We want to understand how the time to find the length of a linked list changes as the list grows.
How does the number of steps increase when the list gets longer?
Analyze the time complexity of the following code snippet.
```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def get_length(head):
    count = 0
    current = head
    while current is not None:
        count += 1
        current = current.next
    return count
```
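To see the function in action, here is a small runnable sketch that restates the `Node` and `get_length` definitions from above, builds a three-node list by hand, and counts its nodes (the empty-list case is included too):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def get_length(head):
    count = 0
    current = head
    while current is not None:
        count += 1
        current = current.next
    return count

# Build the list 1 -> 2 -> 3 by linking nodes manually.
head = Node(1)
head.next = Node(2)
head.next.next = Node(3)

print(get_length(head))  # -> 3
print(get_length(None))  # an empty list has length 0 -> 0
```

Note that `get_length(None)` returns 0 without any special-case code: the `while` loop simply never runs.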
This code counts how many nodes are in a linked list by moving from the first node to the last.
- Primary operation: The while loop that visits each node once.
- How many times: Exactly once per node in the list, so as many times as the list length.
Each new node adds one more step to count it, so the steps grow directly with the list size.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 steps |
| 100 | 100 steps |
| 1000 | 1000 steps |
Pattern observation: The number of steps grows linearly with the list size — double the nodes, double the steps.
Time Complexity: O(n)
This means the time to find the length grows directly with how many nodes are in the list.
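You can verify the linear pattern empirically. The sketch below (helper names like `build_list` are illustrative, not part of the lesson) instruments the counting loop with a step counter and runs it on the same input sizes as the table:

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def build_list(n):
    """Build a linked list of n nodes: n -> ... -> 2 -> 1 reversed to 1..n."""
    head = None
    for v in range(n, 0, -1):
        node = Node(v)
        node.next = head
        head = node
    return head

def get_length_counting_steps(head):
    count = 0
    steps = 0  # number of loop iterations performed
    current = head
    while current is not None:
        steps += 1   # one step per node visited
        count += 1
        current = current.next
    return count, steps

for n in (10, 100, 1000):
    length, steps = get_length_counting_steps(build_list(n))
    print(n, length, steps)  # steps equals n: linear growth
```

For each input size, the number of loop iterations exactly matches the number of nodes, which is the O(n) behavior the table predicts.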
[X] Wrong: "The length can be found instantly without looking at each node."
[OK] Correct: A plain linked list does not store its size, so you must visit every node to count them.
Knowing how to analyze simple linked list operations like this builds a strong base for more complex problems.
"What if the linked list stored its length as a property updated on insertions? How would the time complexity change?"