Radix Sort Algorithm in Go (DSA) - Time & Space Complexity
We want to understand how the time taken by Radix Sort changes as the input size grows.
Specifically, how the number of elements and the number of digits affect the work done.
Analyze the time complexity of the following Radix Sort code snippet.
// radixSort sorts a slice of non-negative integers using least-significant-digit
// radix sort, with a stable counting sort on each decimal digit.
func radixSort(arr []int) []int {
	if len(arr) == 0 {
		return arr // nothing to sort; also avoids arr[0] panicking below
	}
	// Find the maximum value to know how many digit passes are needed.
	max := arr[0]
	for _, v := range arr {
		if v > max {
			max = v
		}
	}
	n := len(arr)
	output := make([]int, n)
	// One counting-sort pass per decimal digit, least significant first.
	for exp := 1; max/exp > 0; exp *= 10 {
		count := make([]int, 10)
		// Count occurrences of each digit at the current place value.
		for i := 0; i < n; i++ {
			count[(arr[i]/exp)%10]++
		}
		// Prefix sums turn counts into final positions.
		for i := 1; i < 10; i++ {
			count[i] += count[i-1]
		}
		// Place elements into output, iterating backwards to keep the sort stable.
		for i := n - 1; i >= 0; i-- {
			digit := (arr[i] / exp) % 10
			output[count[digit]-1] = arr[i]
			count[digit]--
		}
		copy(arr, output)
	}
	return arr
}
This code sorts integers by processing digits from least to most significant.
Look for loops and repeated steps:
- Primary operation: Counting sort on each digit place.
- How many times: Once for each digit of the largest number.
- Inside each digit pass, loops run over all elements multiple times.
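The per-pass cost described above can be made concrete with a small instrumented sketch. The `countVisits` helper below is my own illustration, not part of the original code: it mirrors the pass structure of `radixSort` and tallies roughly 3·n element visits per pass (counting loop, placement loop, and copy).

```go
package main

import "fmt"

// countVisits is a hypothetical instrumentation helper: it reports how many
// digit passes radix sort would make over arr, and approximately how many
// element visits those passes perform (about 3n per pass).
func countVisits(arr []int) (passes, visits int) {
	if len(arr) == 0 {
		return 0, 0
	}
	max := arr[0]
	for _, v := range arr {
		if v > max {
			max = v
		}
	}
	n := len(arr)
	for exp := 1; max/exp > 0; exp *= 10 {
		passes++
		visits += 3 * n // count loop + placement loop + copy
	}
	return passes, visits
}

func main() {
	arr := []int{170, 45, 75, 90, 802, 24, 2, 66}
	p, v := countVisits(arr)
	fmt.Println(p, v) // 3 passes (max 802 has 3 digits), 3*3*8 = 72 visits
}
```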
The work depends on two things: number of elements (n) and number of digits (d).
| Input Size (n) | Digits (d) | Approx. Operations (n × d) |
|---|---|---|
| 10 | 2 | ≈ 20 element visits |
| 100 | 3 | ≈ 300 element visits |
| 1000 | 4 | ≈ 4000 element visits |
Pattern observation: Operations grow in proportion to n × d, so doubling either n or d roughly doubles the work.
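To check the pattern, here is a minimal sketch that recomputes the table rows: `digits` (a helper I am introducing for illustration) counts the decimal digits of the largest value, and n × d gives the estimated operation count.

```go
package main

import "fmt"

// digits returns the number of decimal digits of a non-negative integer.
func digits(x int) int {
	d := 1
	for x >= 10 {
		x /= 10
		d++
	}
	return d
}

func main() {
	// n and a representative largest value for each table row.
	for _, c := range []struct{ n, max int }{{10, 99}, {100, 999}, {1000, 9999}} {
		d := digits(c.max)
		fmt.Println(c.n, d, c.n*d) // prints: n, d, and the n*d estimate
	}
}
```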
Time Complexity: O(d * n)
This means the time grows linearly with the number of elements and with the number of digits in the largest number.
Space Complexity: O(n), for the output slice plus the fixed 10-slot count array used in each pass.
[X] Wrong: "Radix Sort always runs in linear time O(n)."
[OK] Correct: The number of digits (d) matters; if numbers are very large, d grows and so does time.
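A quick way to see why d cannot be ignored: the number of passes depends on the magnitude of the largest value, not on n. The `passes` helper below is my own sketch of the pass-counting loop from `radixSort`.

```go
package main

import "fmt"

// passes returns how many decimal-digit passes radix sort makes for a
// given maximum value: one pass per digit of max.
func passes(max int) int {
	p := 0
	for exp := 1; max/exp > 0; exp *= 10 {
		p++
	}
	return p
}

func main() {
	fmt.Println(passes(99))        // 2
	fmt.Println(passes(1_000_000)) // 7
	fmt.Println(passes(1 << 40))   // 13 (1<<40 = 1099511627776 has 13 digits)
}
```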
Understanding Radix Sort's time complexity helps you explain when it is efficient and when it might not be.
This skill shows you can analyze algorithms beyond simple loops, a key part of problem solving.
"What if we changed the base from 10 to 2 (binary digits)? How would the time complexity change?"
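One way to explore that question is to parameterize the base. In base b, the digit count becomes d = floor(log_b(max)) + 1, so the complexity is O(n · log_b(max)) with each pass costing O(n + b). Smaller bases mean cheaper count arrays but more passes. The `passesInBase` helper below is an illustrative sketch, not part of the original code.

```go
package main

import "fmt"

// passesInBase returns how many digit passes radix sort makes over values
// up to max when digits are read in the given base (base >= 2).
func passesInBase(max, base int) int {
	p := 0
	for exp := 1; max/exp > 0; exp *= base {
		p++
	}
	return p
}

func main() {
	fmt.Println(passesInBase(1000, 10)) // 4 decimal digits
	fmt.Println(passesInBase(1000, 2))  // 10 binary digits (512 <= 1000 < 1024)
}
```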