Generate All Permutations of Array in DSA C - Time & Space Complexity
When we generate all permutations of an array, we want to know how the time required grows as the array gets bigger.
The question: how many steps does it take to list every possible ordering of the array elements?
Analyze the time complexity of the following code snippet.
```c
// Helper: exchange two integers through pointers.
void swap(int *a, int *b) {
    int tmp = *a;
    *a = *b;
    *b = tmp;
}

void permute(int *arr, int start, int end) {
    if (start == end) {
        // print or store the permutation
        return;
    }
    for (int i = start; i <= end; i++) {
        swap(&arr[start], &arr[i]);   // place element i at position start
        permute(arr, start + 1, end); // permute the remaining suffix
        swap(&arr[start], &arr[i]);   // backtrack: undo the swap
    }
}
```
This code generates all possible orders (permutations) of the array elements by swapping and recursively exploring each choice.
Identify the repeated work: the loop, the recursion, and the swaps.
- Primary operation: a loop that swaps elements, wrapped around a recursive call.
- How many times: the first level of recursion branches n ways, the next n - 1 ways, and so on, so the call tree has n x (n - 1) x ... x 1 = n! leaves, one per arrangement.
Each time we add one more element, the number of possible orders multiplies by that new number.
| Input Size (n) | Approx. Operations (n!) |
|---|---|
| 3 | 6 (3 × 2 × 1) |
| 5 | 120 (5 × 4 × 3 × 2 × 1) |
| 7 | 5040 (7 × 6 × 5 × 4 × 3 × 2 × 1) |
Pattern observation: the operation count is the product of all integers from 1 up to n — the factorial, n! — which grows faster than any polynomial.
Time Complexity: O(n!) recursive branches, one per permutation. (Printing or copying each permutation costs another O(n), so the total work is O(n × n!).)
Space Complexity: O(n) for the recursion stack, since the recursion goes at most n levels deep (not counting storage for the output).
This means the time needed grows multiplicatively: each extra element multiplies the running time by the new array length.
[X] Wrong: "The time grows like n squared or n cubed because of the loops."
[OK] Correct: The loop sits inside recursive calls that multiply its work at every level, so the growth is factorial — far faster than any fixed-degree polynomial.
Understanding this growth helps you explain why generating all permutations is expensive and when it's practical to do so.
"What if we only wanted permutations of length k from n elements? How would the time complexity change?"