Heapify operation in Data Structures Theory - Time & Space Complexity
Heapify is a key step in building and maintaining a heap data structure.
We want to understand how the time to fix the heap changes as the heap size grows.
Analyze the time complexity of the heapify function below.
```javascript
// Restore the max-heap property at index i in arr[0..n-1],
// assuming the subtrees rooted at i's children are already heaps.
function heapify(arr, n, i) {
  let largest = i;
  const left = 2 * i + 1;   // index of left child
  const right = 2 * i + 2;  // index of right child

  if (left < n && arr[left] > arr[largest]) {
    largest = left;
  }
  if (right < n && arr[right] > arr[largest]) {
    largest = right;
  }
  if (largest !== i) {
    // Swap the larger child up, then keep fixing one level down.
    [arr[i], arr[largest]] = [arr[largest], arr[i]];
    heapify(arr, n, largest);
  }
}
```
This code fixes the heap property at index i by moving the value down the tree if needed.
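To see it in action, the standard bottom-up heap build calls this function on every internal node, from the last parent up to the root. The sketch below repeats the function (with the swap written inline so the snippet is self-contained) and builds a max-heap from a small sample array:

```javascript
// Restore the max-heap property at index i (same logic as above,
// with the swap done inline so this snippet runs on its own).
function heapify(arr, n, i) {
  let largest = i;
  const left = 2 * i + 1;
  const right = 2 * i + 2;
  if (left < n && arr[left] > arr[largest]) largest = left;
  if (right < n && arr[right] > arr[largest]) largest = right;
  if (largest !== i) {
    [arr[i], arr[largest]] = [arr[largest], arr[i]];
    heapify(arr, n, largest);
  }
}

// Bottom-up build: heapify every internal node, last parent first.
const data = [3, 1, 10, 6, 4];
for (let i = Math.floor(data.length / 2) - 1; i >= 0; i--) {
  heapify(data, data.length, i);
}
console.log(data); // [10, 6, 3, 1, 4] — the maximum is now at the root
```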
Look for repeated actions that affect time.
- Primary operation: Recursive calls moving down the heap tree.
- How many times: At most one recursive call per level of the heap, following a single path from root to leaf.
A heap is a complete binary tree, so its height is roughly log₂(n), where n is the number of elements.
| Input Size (n) | Approx. Operations (levels) |
|---|---|
| 10 | About 4 |
| 100 | About 7 |
| 1000 | About 10 |
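The level counts in the table can be checked directly. A complete binary heap of n elements has floor(log₂ n) + 1 levels (`heapLevels` below is a small helper written for this check, not part of the original code):

```javascript
// Levels in a complete binary heap of n elements: heapify makes
// at most one recursive call per level, so this bounds its work.
const heapLevels = (n) => Math.floor(Math.log2(n)) + 1;

console.log(heapLevels(10));   // 4
console.log(heapLevels(100));  // 7
console.log(heapLevels(1000)); // 10
```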
As the input size grows, the number of steps grows slowly, increasing by one each time the size doubles.
Time Complexity: O(log n)
Space Complexity: O(log n) in the worst case, for the recursion stack (one frame per level of the path).
This means the time to fix the heap grows slowly, proportional to the height of the heap tree.
[X] Wrong: "Heapify takes linear time because it looks at many elements."
[OK] Correct: Heapify only moves down one path in the tree, not all elements, so it takes time proportional to the tree height, not the total number of elements.
Understanding heapify's time complexity helps you explain how heaps efficiently maintain order, a skill valued in many coding challenges and real-world applications.
What if heapify was implemented iteratively instead of recursively? How would the time complexity change?
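One way to explore that question: since the recursive call is the last thing heapify does (tail recursion), it can be replaced with a loop. A sketch, assuming the same array layout as above; the time complexity stays O(log n), but the auxiliary space drops from O(log n) stack frames to O(1):

```javascript
// Iterative sift-down: same O(log n) time, O(1) extra space
// (no recursion stack). A sketch, not the article's original code.
function heapifyIterative(arr, n, i) {
  while (true) {
    let largest = i;
    const left = 2 * i + 1;
    const right = 2 * i + 2;
    if (left < n && arr[left] > arr[largest]) largest = left;
    if (right < n && arr[right] > arr[largest]) largest = right;
    if (largest === i) break;           // heap property restored
    [arr[i], arr[largest]] = [arr[largest], arr[i]];
    i = largest;                        // continue one level down
  }
}

const a = [1, 9, 5];
heapifyIterative(a, a.length, 0);
console.log(a); // [9, 1, 5]
```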