Sorting Stability and When to Use Which Sort in DSA C++ - Time & Space Complexity
Sorting algorithms arrange data in order. Knowing their time cost helps pick the right one.
We ask: How does sorting time grow as data size grows, and when is stability important?
Analyze the time complexity of this simple insertion sort code.
```cpp
void insertionSort(int arr[], int n) {
    for (int i = 1; i < n; i++) {
        int key = arr[i];      // element to insert into the sorted prefix arr[0..i-1]
        int j = i - 1;
        // Shift elements larger than key one slot to the right.
        while (j >= 0 && arr[j] > key) {
            arr[j + 1] = arr[j];
            j--;
        }
        arr[j + 1] = key;      // drop key into its correct position
    }
}
```
This code sorts the array by inserting each element into its correct place within the already-sorted prefix, preserving the relative order of equal elements, which makes the sort stable.
Look for loops and repeated steps.
- Primary operation: the inner while loop shifts elements rightward to make room for the key.
- How many times: the outer for loop runs n-1 times; on the i-th pass, the inner loop runs up to i times.
As the array grows, the worst-case number of shifts grows roughly with the square of n.
| Input Size (n) | Worst-Case Shifts (n(n-1)/2) |
|---|---|
| 10 | 45 |
| 100 | 4,950 |
| 1000 | 499,500 |
Pattern observation: the work grows with the square of the input size, so doubling n roughly quadruples the number of shifts.
Time Complexity: O(n²)
This means sorting time grows quickly as the data grows; reverse-sorted input is the worst case, while already-sorted input needs only a single O(n) pass with no shifts.
[X] Wrong: "All sorting algorithms take the same time regardless of data order."
[OK] Correct: Data order matters. Insertion sort is fast on nearly sorted data, while quicksort averages O(n log n) but can degrade to O(n²) on unlucky pivot choices.
Understanding sorting time helps you choose the right tool in coding challenges and real projects, showing you know both theory and practice.
What if we changed insertion sort to quicksort? How would the time complexity and stability change?