Remove Duplicates from Sorted Array Using Two Pointers in C - Time & Space Complexity
We want to understand how the time needed to remove duplicates from a sorted array grows as the array gets bigger.
Specifically, how many steps does the two-pointer method take as the input size increases?
Analyze the time complexity of the following code snippet.
```c
int removeDuplicates(int* nums, int numsSize) {
    if (numsSize == 0) return 0;
    int i = 0;                               // slow pointer: end of the unique prefix
    for (int j = 1; j < numsSize; j++) {     // fast pointer: scans every element once
        if (nums[j] != nums[i]) {            // found a new value
            i++;
            nums[i] = nums[j];               // overwrite the duplicate in place
        }
    }
    return i + 1;                            // length of the deduplicated prefix
}
```
This code removes duplicates from a sorted array by using two pointers to overwrite duplicates in place.
Identify the repeated work: any loops, recursion, or array traversals.
- Primary operation: The for-loop that moves pointer `j` from the start to the end of the array.
- How many times: Exactly once for each element after the first, so `numsSize - 1` times.
As the array size grows, the loop runs once for each element, so the steps increase directly with input size.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 9 checks and possible assignments |
| 100 | About 99 checks and possible assignments |
| 1000 | About 999 checks and possible assignments |
Pattern observation: The number of operations grows linearly as the input size increases.
Time Complexity: O(n)
This means the time to remove duplicates grows in direct proportion to the number of elements in the array.
Space Complexity: O(1)
The algorithm overwrites duplicates in place and uses only the two pointer variables, so the extra memory stays constant no matter how large the array is.
[X] Wrong: "Because we skip duplicates, the time complexity is less than linear."
[OK] Correct: Even if duplicates are skipped, the loop still checks every element once, so the time grows linearly with input size.
Understanding this linear time solution shows you can efficiently process sorted data with simple pointers, a skill often valued in coding challenges.
"What if the array was not sorted? How would the time complexity change if we still tried to remove duplicates in place?"
