Morris Traversal Inorder Without Stack in DSA Go - Time & Space Complexity
We want to understand how the Morris Traversal algorithm performs when it visits nodes in a binary tree without using extra memory for a stack.
How does the number of steps grow as the tree gets bigger?
Analyze the time complexity of the following code snippet.
```go
func morrisInorder(root *Node) {
	current := root
	for current != nil {
		if current.Left == nil {
			// No left subtree: visit current node, then move right
			current = current.Right
		} else {
			// Find the inorder predecessor: the rightmost node in the left subtree
			predecessor := current.Left
			for predecessor.Right != nil && predecessor.Right != current {
				predecessor = predecessor.Right
			}
			if predecessor.Right == nil {
				// First pass: create a temporary thread back to current
				predecessor.Right = current
				current = current.Left
			} else {
				// Second pass: remove the thread, restoring the tree
				predecessor.Right = nil
				// Visit current node
				current = current.Right
			}
		}
	}
}
```
This code visits all nodes of a binary tree in order without using a stack or recursion by temporarily modifying the tree links.
Identify the repeated work: loops, recursion, or traversals.
- Primary operation: The outer loop visits each node, and the inner loop finds the rightmost node in the left subtree (predecessor).
- How many times: Each edge in the tree is visited at most twice: once to create a temporary link and once to remove it.
As the number of nodes (n) increases, the algorithm visits each node and its edges a limited number of times.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 20 visits (nodes and edges) |
| 100 | About 200 visits |
| 1000 | About 2000 visits |
Pattern observation: The total number of steps grows at roughly twice the number of nodes, so the growth is linear.
Time Complexity: O(n)
This means the algorithm visits each node a small fixed number of times, so the total work grows directly with the number of nodes.
Space Complexity: O(1)
Apart from a few pointer variables, no extra memory is used: instead of a stack, the traversal temporarily reuses the tree's own right links.
[X] Wrong: "Because of the inner loop, the algorithm must be O(n²) because it looks like nested loops."
[OK] Correct: Each edge is visited at most twice, so the inner loop does not run fully for every node. This keeps the total work linear, not quadratic.
Understanding Morris Traversal demonstrates skill with clever tree algorithms that save memory without sacrificing speed. It is a great example of thinking beyond the usual tools of stacks and recursion.
"What if we changed the tree to be a linked list (all nodes only have right children)? How would the time complexity change?"