# Common Linked List Patterns (Runner Technique): Time & Space Complexity
When working with linked lists, the runner technique helps us move through the list efficiently.
We want to know how the time needed changes as the list gets longer.
Analyze the time complexity of the following code snippet.
```cpp
Node* slow = head;
Node* fast = head;
while (fast != nullptr && fast->next != nullptr) {
    slow = slow->next;
    fast = fast->next->next;
}
// slow now points to the middle node
```
This code uses two pointers moving at different speeds to find the middle of a linked list.
Identify the operations that repeat: loops, recursion, or list traversals.
- Primary operation: Moving the fast and slow pointers through the list.
- How many times: The loop runs about n/2 times, because fast advances two nodes per iteration and the loop exits when fast reaches the end of the list.
As the list gets longer, the loop runs more times, but only about half as many as the list length.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 5 |
| 100 | 50 |
| 1000 | 500 |
Pattern observation: The number of steps grows roughly in direct proportion to the list size.
Time Complexity: O(n)
This means the time to find the middle grows linearly as the list gets longer.
[X] Wrong: "Because the fast pointer moves two steps, the time complexity is O(n/2), which is faster than O(n)."
[OK] Correct: Big-O notation drops constant factors, so O(n/2) simplifies to O(n). The time still grows linearly with input size.
Understanding the runner technique shows you can handle linked lists efficiently, a common skill interviewers look for.
"What if the fast pointer moved three steps at a time instead of two? How would the time complexity change?"