Quick Sort Algorithm in Go (DSA) - Time & Space Complexity
We want to understand how the time to sort a list using Quick Sort changes as the list grows.
How does the number of steps grow when the list gets bigger?
Analyze the time complexity of the following code snippet.
```go
func quickSort(arr []int) []int {
	// Base case: a slice of 0 or 1 elements is already sorted.
	if len(arr) <= 1 {
		return arr
	}
	// Choose the middle element as the pivot.
	pivot := arr[len(arr)/2]
	left := []int{}   // elements smaller than the pivot
	right := []int{}  // elements larger than the pivot
	middle := []int{} // elements equal to the pivot
	for _, v := range arr {
		if v < pivot {
			left = append(left, v)
		} else if v > pivot {
			right = append(right, v)
		} else {
			middle = append(middle, v)
		}
	}
	// Recursively sort each partition, then stitch them back together.
	left = quickSort(left)
	right = quickSort(right)
	return append(append(left, middle...), right...)
}
```
This code sorts an array by picking a pivot, splitting the array into parts, and sorting each part recursively.
Identify the loops, recursion, and array traversals that repeat.
- Primary operation: Looping through the array to split it around the pivot.
- How many times: The split happens at every level of recursion; each level traverses roughly all n elements, and balanced splits give about log n levels.
Each time we split, we do work proportional to the current size, and then repeat on smaller parts.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 30 to 40 operations |
| 100 | About 700 to 800 operations |
| 1000 | About 10,000 to 15,000 operations |
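One way to sanity-check the numbers in this table is to instrument the partition loop with a counter. The sketch below is illustrative: the `ops` counter, the `quickSortCounted` name, and the scrambled test input are additions for this experiment, not part of the lesson's code.

```go
package main

import "fmt"

// ops counts how many elements are examined across all partition passes.
var ops int

// quickSortCounted is the same algorithm as above, with one counter
// increment per element visited in the partition loop.
func quickSortCounted(arr []int) []int {
	if len(arr) <= 1 {
		return arr
	}
	pivot := arr[len(arr)/2]
	var left, middle, right []int
	for _, v := range arr {
		ops++ // one unit of work per element per partition pass
		switch {
		case v < pivot:
			left = append(left, v)
		case v > pivot:
			right = append(right, v)
		default:
			middle = append(middle, v)
		}
	}
	left = quickSortCounted(left)
	right = quickSortCounted(right)
	return append(append(left, middle...), right...)
}

func main() {
	for _, n := range []int{10, 100, 1000} {
		arr := make([]int, n)
		for i := range arr {
			arr[i] = (i * 37) % n // a fixed, scrambled permutation of 0..n-1
		}
		ops = 0
		quickSortCounted(arr)
		fmt.Printf("n=%d  ops=%d\n", n, ops)
	}
}
```

Running this prints counts in the same ballpark as the table: the totals grow faster than n but far slower than n².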
Pattern observation: The operations grow a bit faster than the input size but not as fast as the square of the input.
Time Complexity: O(n log n)
This means the sorting time grows slightly faster than linearly: roughly the input size multiplied by its logarithm.
[X] Wrong: "Quick Sort always takes the same time regardless of input order."
[OK] Correct: If the pivot choices are poor, Quick Sort can take much longer, up to O(n^2) in the worst case.
Understanding Quick Sort's time complexity helps you explain why it is fast on average and what cases slow it down, showing your grasp of sorting algorithms.
"What if we always picked the first element as pivot instead of the middle? How would the time complexity change?"
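One way to explore that question is to count operations with a first-element pivot on an already-sorted input, which is exactly the case where that strategy degrades. This sketch is an illustration; `quickSortFirst` and the `ops` counter are names introduced here, not from the lesson.

```go
package main

import "fmt"

// ops counts elements examined during partitioning, so we can compare
// pivot strategies.
var ops int

// quickSortFirst is the same algorithm, but always picks the FIRST
// element as the pivot instead of the middle one.
func quickSortFirst(arr []int) []int {
	if len(arr) <= 1 {
		return arr
	}
	pivot := arr[0] // first element as pivot
	var left, middle, right []int
	for _, v := range arr {
		ops++
		switch {
		case v < pivot:
			left = append(left, v)
		case v > pivot:
			right = append(right, v)
		default:
			middle = append(middle, v)
		}
	}
	return append(append(quickSortFirst(left), middle...), quickSortFirst(right)...)
}

func main() {
	// A sorted input is the worst case for a first-element pivot:
	// every split peels off just one element, so the partition sizes
	// shrink by one each call and the total work is about n^2/2.
	n := 1000
	arr := make([]int, n)
	for i := range arr {
		arr[i] = i
	}
	ops = 0
	quickSortFirst(arr)
	fmt.Println("ops on sorted input:", ops) // close to n*n/2, not n log n
}
```

On sorted or nearly-sorted data the first-element pivot turns every call into the most unbalanced split possible, which is why the worst case becomes O(n²) while the middle-element pivot handles this input well.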