Merge K Sorted Lists Using a Min Heap in Go - Time & Space Complexity
We want to know how long it takes to merge multiple sorted lists using a min heap.
Specifically, how does the time grow as the number of lists and their sizes increase?
Analyze the time complexity of the following code snippet.
```go
import "container/heap"

// Item records a value together with where it came from, so that after a
// pop we can push the next element from the same list.
type Item struct {
	value        int
	listIndex    int
	elementIndex int
}

// IntHeap implements heap.Interface, ordered by Item.value (a min heap).
type IntHeap []*Item

func (h IntHeap) Len() int           { return len(h) }
func (h IntHeap) Less(i, j int) bool { return h[i].value < h[j].value }
func (h IntHeap) Swap(i, j int)      { h[i], h[j] = h[j], h[i] }

func (h *IntHeap) Push(x interface{}) {
	*h = append(*h, x.(*Item))
}

func (h *IntHeap) Pop() interface{} {
	old := *h
	n := len(old)
	item := old[n-1]
	*h = old[0 : n-1]
	return item
}

func mergeKLists(lists [][]int) []int {
	minHeap := &IntHeap{}
	heap.Init(minHeap)
	// Seed the heap with the first element of each non-empty list.
	for i, list := range lists {
		if len(list) > 0 {
			heap.Push(minHeap, &Item{value: list[0], listIndex: i, elementIndex: 0})
		}
	}
	var result []int
	// Repeatedly take the smallest pending element, then replace it with
	// its successor from the same list, if one exists.
	for minHeap.Len() > 0 {
		item := heap.Pop(minHeap).(*Item)
		result = append(result, item.value)
		if item.elementIndex+1 < len(lists[item.listIndex]) {
			nextIndex := item.elementIndex + 1
			heap.Push(minHeap, &Item{value: lists[item.listIndex][nextIndex], listIndex: item.listIndex, elementIndex: nextIndex})
		}
	}
	return result
}
```
This code merges k sorted lists into one sorted list using a min heap to always pick the smallest next element.
Identify the operations that repeat: loops, recursion, and array traversals.
- Primary operation: Removing the smallest element from the heap and adding the next element from the same list.
- How many times: This happens once for every element across all lists, so n times in total, where n is the combined number of elements.
Each element is pushed onto and popped from the heap exactly once, and the heap holds at most k items at a time (one per list).
| Total Elements (n) | Lists (k) | Approx. Operations |
|---|---|---|
| 10 | 3 | About 10 push/pop operations on a heap of size ≤ 3 |
| 100 | 5 | About 100 push/pop operations on a heap of size ≤ 5 |
| 1000 | 10 | About 1000 push/pop operations on a heap of size ≤ 10 |
Pattern observation: The number of operations grows linearly with total elements, but each heap operation depends on k, which is usually much smaller than n.
Time Complexity: O(n log k)
This means the running time grows linearly with the total number of elements n, while each individual heap operation costs only O(log k), because the heap never holds more than k items.
Space Complexity: O(k) extra space for the heap; the O(n) output slice is required by any merge and is usually not counted as auxiliary space.
[X] Wrong: "The time complexity is O(n log n) because we use a heap for all elements."
[OK] Correct: The heap never holds all n elements at once, only up to k elements, so the log factor depends on k, not n.
Understanding this time complexity helps you explain efficient merging of sorted data streams, a common real-world problem.
"What if we used a simple list instead of a min heap to find the smallest element each time? How would the time complexity change?"