Counting Sort Algorithm in DSA Go - Time & Space Complexity
Counting Sort is a sorting method that sorts non-negative integers by counting how many times each value appears, rather than by comparing elements. We want to know how its running time changes as the input grows.
Analyze the time complexity of the following code snippet.
```go
package main

// countingSort sorts arr in place, assuming every value lies in the
// range [0, maxVal]. A negative value would cause an index panic.
func countingSort(arr []int, maxVal int) []int {
	// Count how many times each value occurs.
	count := make([]int, maxVal+1)
	for _, num := range arr {
		count[num]++
	}
	// Overwrite arr with each value repeated freq times, in order.
	index := 0
	for num, freq := range count {
		for freq > 0 {
			arr[index] = num
			index++
			freq--
		}
	}
	return arr
}
```
This code sorts an array of numbers by counting how many times each number appears, then rebuilding the sorted array.
Identify the loops, recursion, or array traversals that repeat.
- Primary operation: Two loops: one to count frequencies, one to rebuild the sorted array.
- How many times: The first loop runs once per element (n times). The second loop visits every possible value (k+1 times), and its inner loop runs once per occurrence of each value, so across the whole pass the inner body executes exactly n times in total.
Explain the growth pattern intuitively.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 + k operations |
| 100 | About 100 + k operations |
| 1000 | About 1000 + k operations |
Pattern observation: The time grows mostly with the number of elements (n) plus the range of numbers (k).
Time Complexity: O(n + k)
This means the time to sort depends on both how many numbers you have and the range of those numbers.
[X] Wrong: "Counting Sort always runs in linear time O(n)."
[OK] Correct: The time also depends on the range of numbers (k). If k is very large compared to n, the algorithm can be slower.
Understanding Counting Sort's time complexity helps you explain when it is efficient and when it is not, showing your grasp of algorithm choices.
"What if the range of numbers (k) is much larger than the number of elements (n)? How would the time complexity change?"