Top K Frequent Elements Using a Heap in Go - Time & Space Complexity
We want to know how the time needed grows when finding the top K frequent elements using a heap.
How does the work change as the input list gets bigger?
Analyze the time complexity of the following code snippet.
```go
func topKFrequent(nums []int, k int) []int {
	// Count how often each value appears: O(n).
	freqMap := make(map[int]int)
	for _, num := range nums {
		freqMap[num]++
	}
	// Min-heap of Pairs ordered by freq, capped at k entries.
	h := &IntHeap{}
	heap.Init(h)
	for num, freq := range freqMap {
		heap.Push(h, Pair{num, freq})
		if h.Len() > k {
			heap.Pop(h) // evict the least frequent of the k+1 entries
		}
	}
	// Drain the heap: the k most frequent elements remain.
	res := make([]int, 0, k)
	for h.Len() > 0 {
		res = append(res, heap.Pop(h).(Pair).num)
	}
	return res
}

// Pair and IntHeap types omitted for brevity; despite its name,
// IntHeap here must be a min-heap of Pairs ordered by freq.
```
This code counts frequencies, then uses a min-heap to keep only the top K frequent elements.
Identify the loops, recursion, and array traversals that do repeated work.
- Primary operation: loop over all numbers to count frequencies, then loop over the unique numbers to push into (and possibly pop from) the heap.
- How many times: the counting loop runs n times (n = length of the input); the heap loop runs m times (m = number of unique elements), and each push/pop costs O(log k) because the heap never holds more than k+1 entries.
As input size n grows, counting frequencies takes about n steps. Then, heap operations depend on m, the unique count.
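The counting phase on its own is a single pass doing constant work per element; here is a minimal sketch with a made-up input slice:

```go
package main

import "fmt"

func main() {
	nums := []int{4, 4, 7, 4, 7, 9} // example input, n = 6
	freq := make(map[int]int)       // value -> count
	for _, n := range nums {        // one pass: n constant-time map updates
		freq[n]++
	}
	fmt.Println(freq[4], freq[7], freq[9])
}
```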
| Input Size (n) | Approx. Operations (worst case: all elements unique, m = n) |
|---|---|
| 10 | ~10 counting steps + ~10 heap pushes/pops |
| 100 | ~100 counting steps + ~100 heap pushes/pops |
| 1000 | ~1000 counting steps + ~1000 heap pushes/pops |
Pattern observation: counting grows linearly with n, while heap operations grow with the number of unique elements m, which is at most n and often much smaller.
Time Complexity: O(n + m log k)
This means the counting phase contributes a term linear in n, while each of the m heap operations costs only O(log k) because the heap never grows beyond k+1 entries; for small k, the heap work stays cheap.
[X] Wrong: "Heap operations run n times, so time is O(n log n)."
[OK] Correct: The heap's size is capped at k, so each push/pop costs O(log k) rather than O(log n), and only the m unique elements ever enter the heap, giving O(m log k) heap work in total.
Understanding how to use heaps to keep top K items efficiently is a valuable skill that shows you can handle large data smartly.
"What if we used a max-heap for all elements and then popped k times? How would the time complexity change?"