Next Permutation of an Array in C - Time & Space Complexity
We want to understand how the time needed to find the next permutation changes as the array size grows: how much more work does the algorithm do as the array gets longer?
Analyze the time complexity of the following code snippet.
```c
void nextPermutation(int* nums, int numsSize) {
    /* Step 1: scan from the right for the first index i where nums[i] < nums[i+1]. */
    int i = numsSize - 2;
    while (i >= 0 && nums[i] >= nums[i + 1]) {
        i--;
    }
    if (i >= 0) {
        /* Step 2: find the rightmost element larger than nums[i] and swap the two. */
        int j = numsSize - 1;
        while (nums[j] <= nums[i]) {
            j--;
        }
        int temp = nums[i];
        nums[i] = nums[j];
        nums[j] = temp;
    }
    /* Step 3: reverse the suffix after i so it becomes the smallest arrangement. */
    int left = i + 1, right = numsSize - 1;
    while (left < right) {
        int temp = nums[left];
        nums[left++] = nums[right];
        nums[right--] = temp;
    }
}
```
This code finds the next lexicographically greater permutation of the array or rearranges it to the smallest if none exists.
Look for loops and repeated steps.
- Primary operation: Three while loops that scan parts of the array, one after another (none is nested inside another).
- How many times: Each loop runs at most n times, where n is the array length, so the total is bounded by roughly 3n iterations.
As the array size grows, the loops may scan more elements.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | Up to about 30 steps |
| 100 | Up to about 300 steps |
| 1000 | Up to about 3000 steps |
Pattern observation: The total steps grow roughly in a straight line with the array size.
Time Complexity: O(n)
Space Complexity: O(1)
This means the time to find the next permutation grows linearly with the array length, and because all swaps happen in place, only a constant amount of extra memory is used.
[X] Wrong: "The algorithm always checks every pair of elements, so it must be O(n²)."
[OK] Correct: Each loop makes at most one pass over the array and none is nested inside another, so the total work is proportional to n, not n squared.
Understanding this linear time solution shows you can handle array scanning and swapping efficiently, a skill useful in many coding problems.
"What if the array was sorted in descending order? How would the time complexity behave then?"
