Detect Cycle in a Linked List Using Floyd's Algorithm in C - Time & Space Complexity
We want to understand how long it takes to check if a linked list has a cycle using Floyd's Algorithm.
The question is: how does the time needed grow as the list gets longer?
Analyze the time complexity of the following code snippet.
```c
// Detect cycle using Floyd's Algorithm
int hasCycle(Node* head) {
    Node *slow = head, *fast = head;
    while (fast != NULL && fast->next != NULL) {
        slow = slow->next;
        fast = fast->next->next;
        if (slow == fast) {
            return 1; // cycle found
        }
    }
    return 0; // no cycle
}
```
This code moves two pointers through the list at different speeds to find a cycle.
Identify the operations that repeat: loops, recursion, or traversals.
- Primary operation: The while loop that moves the slow pointer one step and the fast pointer two steps each time.
- How many times: At most proportional to the number of nodes n. In an acyclic list the fast pointer reaches the end after about n/2 iterations; in a cyclic list the slow pointer is caught within O(n) iterations of entering the cycle.
As the list size grows, the number of steps to find a cycle or confirm none grows roughly in a straight line.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 steps |
| 100 | About 100 steps |
| 1000 | About 1000 steps |
Pattern observation: The steps increase directly with the number of nodes, not faster or slower.
Time Complexity: O(n)
This means the time to detect a cycle grows linearly with the list size.
Space Complexity: O(1), since only the two pointers are used no matter how long the list is.
[X] Wrong: "Because the fast pointer moves twice as fast, the algorithm runs in half the time, so it is O(1)."
[OK] Correct: Even though the fast pointer moves faster, the number of loop iterations still depends on the list length. A constant-factor speedup does not change the growth rate, so the time grows linearly, not constant.
Understanding this algorithm's time complexity shows you can analyze pointer-based algorithms and reason about loops, a key skill in coding interviews.
"What if we changed the fast pointer to move three steps at a time? How would the time complexity change?"
