Level-order traversal (BFS) in Data Structures Theory - Time & Space Complexity
We want to understand how the time needed to visit all nodes in a tree grows as the tree gets bigger.
Specifically, how does the level-order traversal (BFS) behave when the tree size increases?
Analyze the time complexity of the following code snippet.
```javascript
function levelOrderTraversal(root) {
  if (!root) return [];
  const queue = [root];
  const result = [];
  // Dequeue with a moving index: Array.prototype.shift() is O(n) per call,
  // which would quietly turn the whole traversal into O(n²).
  let head = 0;
  while (head < queue.length) {
    const node = queue[head++]; // visit the next node in O(1)
    result.push(node.value);
    if (node.left) queue.push(node.left);
    if (node.right) queue.push(node.right);
  }
  return result;
}
```
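A quick usage sketch, assuming plain node objects with `value`, `left`, and `right` fields (the shape the function reads); the `makeNode` helper is illustrative and not part of the original snippet, and the traversal is restated so the sketch runs standalone:

```javascript
// Hypothetical helper for building test nodes.
const makeNode = (value, left = null, right = null) => ({ value, left, right });

// Build:      1
//            / \
//           2   3
//          / \
//         4   5
const root = makeNode(1, makeNode(2, makeNode(4), makeNode(5)), makeNode(3));

// Restated traversal (index-based dequeue keeps each dequeue O(1)).
function levelOrderTraversal(root) {
  if (!root) return [];
  const queue = [root];
  const result = [];
  let head = 0;
  while (head < queue.length) {
    const node = queue[head++];
    result.push(node.value);
    if (node.left) queue.push(node.left);
    if (node.right) queue.push(node.right);
  }
  return result;
}

console.log(levelOrderTraversal(root)); // level by level, left to right: [1, 2, 3, 4, 5]
```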
This code visits every node in a binary tree, level by level, from left to right.
Identify the repeated work: loops, recursion, and array operations.
- Primary operation: Visiting each node once and adding its children to the queue.
- How many times: Exactly once per node in the tree.
As the number of nodes grows, the number of visits grows at the same rate.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 visits |
| 100 | About 100 visits |
| 1000 | About 1000 visits |
Pattern observation: The work grows directly in proportion to the number of nodes.
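The pattern in the table can be checked empirically. A minimal sketch that counts visits while traversing a balanced tree of n nodes; the helper names here are illustrative, not from the original:

```javascript
// Build a balanced binary tree with n nodes (illustrative helper):
// node i gets children 2i+1 and 2i+2, heap-style.
function buildTree(n) {
  if (n <= 0) return null;
  const nodes = Array.from({ length: n }, (_, i) => ({ value: i, left: null, right: null }));
  for (let i = 0; i < n; i++) {
    if (2 * i + 1 < n) nodes[i].left = nodes[2 * i + 1];
    if (2 * i + 2 < n) nodes[i].right = nodes[2 * i + 2];
  }
  return nodes[0];
}

// Level-order traversal that counts how many nodes are visited.
function countVisits(root) {
  if (!root) return 0;
  const queue = [root];
  let visits = 0;
  let head = 0; // O(1) dequeues via a moving index
  while (head < queue.length) {
    const node = queue[head++];
    visits++;
    if (node.left) queue.push(node.left);
    if (node.right) queue.push(node.right);
  }
  return visits;
}

for (const n of [10, 100, 1000]) {
  console.log(n, countVisits(buildTree(n))); // visits === n: work grows linearly
}
```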
Time Complexity: O(n)
This means the time to complete the traversal grows linearly with the number of nodes in the tree.
Space Complexity: O(n)
The result array holds all n values, and the queue's worst case is the widest level of the tree, which in a complete binary tree holds roughly n/2 nodes.
[X] Wrong: "Because there is a loop inside a loop, the time must be quadratic (O(n²))."
[OK] Correct: The pushes inside the loop are not a nested pass over the entire input; each node is enqueued and dequeued exactly once, so the total work is linear, not quadratic.
Understanding BFS time complexity helps you explain how efficiently you can explore all nodes in a tree or graph, a common skill in many coding challenges.
"What if we changed the tree to a graph with cycles and did not track visited nodes? How would the time complexity change?"