Two Pointer Technique on Arrays in DSA Python - Time & Space Complexity
We want to understand how fast the two-pointer technique is when scanning an array: how does the number of steps grow as the array gets bigger?
Analyze the time complexity of the following code snippet.
```python
def two_pointer_example(arr, target):
    # arr must be sorted in ascending order for this to work
    left = 0
    right = len(arr) - 1
    while left < right:
        current_sum = arr[left] + arr[right]
        if current_sum == target:
            return True
        elif current_sum < target:
            # Sum too small: move the left pointer to a larger value
            left += 1
        else:
            # Sum too large: move the right pointer to a smaller value
            right -= 1
    return False
```
This code checks if any two numbers in a sorted array add up to a target value using two pointers.
- Primary operation: The while loop moves the left and right pointers through the array.
- How many times: Each pointer only moves inward and never backtracks, so together they close the gap between the two ends of the array; the loop runs at most n - 1 times.
As the array size grows, the total steps grow roughly in a straight line with the number of elements.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 steps |
| 100 | About 100 steps |
| 1000 | About 1000 steps |
Pattern observation: The steps increase directly with input size, not faster or slower.
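One way to check this pattern empirically is to instrument the loop with a step counter. Below is a sketch; `count_steps` is an illustrative variant of the function above (not part of the original code) that returns how many iterations the loop performed.

```python
# Illustrative variant of two_pointer_example that counts loop iterations.
def count_steps(arr, target):
    left, right = 0, len(arr) - 1
    steps = 0
    while left < right:
        steps += 1
        current_sum = arr[left] + arr[right]
        if current_sum == target:
            return steps
        elif current_sum < target:
            left += 1
        else:
            right -= 1
    return steps

# With a target that can never be reached, the loop runs until the
# pointers meet, which is the worst case: at most n - 1 iterations.
for n in (10, 100, 1000):
    arr = list(range(n))
    print(n, count_steps(arr, -1))
```

Running this shows the step counts tracking the input sizes in the table above, which is exactly the linear pattern.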
Time Complexity: O(n)
This means the running time grows in a straight line (linearly) with the size of the array.
[X] Wrong: "The two pointer method checks every pair, so it must be O(n²)."
[OK] Correct: The pointers move inward without repeating pairs, so total steps are only about n, not n squared.
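To make the contrast concrete, here is a sketch comparing the two-pointer scan against a brute-force check of every pair (`brute_force` is an illustrative name, not from the article). The brute force does roughly n(n - 1)/2 comparisons, while the two-pointer loop does at most n - 1.

```python
from itertools import combinations

# Brute-force baseline: examine every pair -> O(n^2) comparisons.
def brute_force(arr, target):
    return any(a + b == target for a, b in combinations(arr, 2))

# Two-pointer scan on a sorted array -> O(n), as analyzed above.
def two_pointer_example(arr, target):
    left, right = 0, len(arr) - 1
    while left < right:
        current_sum = arr[left] + arr[right]
        if current_sum == target:
            return True
        elif current_sum < target:
            left += 1
        else:
            right -= 1
    return False

# Both agree on every target for a sorted array; only the step count differs.
arr = [1, 3, 5, 7, 11]
for t in range(25):
    assert brute_force(arr, t) == two_pointer_example(arr, t)
```

The two approaches return identical answers on sorted input; the pointers simply skip pairs that provably cannot sum to the target.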
Understanding this technique shows you can solve problems efficiently by scanning arrays smartly, a skill many interviews value.
"What if the array was not sorted? How would the time complexity change if we still tried this two pointer approach?"
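One possible answer, sketched under the assumption that we are allowed to sort a copy of the array first: sorting costs O(n log n), which then dominates the O(n) two-pointer scan, so the overall complexity becomes O(n log n). (`two_sum_unsorted` is an illustrative name, not from the article.)

```python
# Sketch: handle an unsorted array by sorting a copy first,
# then applying the same two-pointer scan as before.
def two_sum_unsorted(arr, target):
    arr = sorted(arr)          # O(n log n) -- this step dominates
    left, right = 0, len(arr) - 1
    while left < right:        # O(n) scan, same as the sorted case
        s = arr[left] + arr[right]
        if s == target:
            return True
        elif s < target:
            left += 1
        else:
            right -= 1
    return False

print(two_sum_unsorted([7, 2, 11, 15], 9))  # True: 2 + 7
```

Without sorting, the pointer-moving logic breaks down, because moving a pointer no longer guarantees the sum moves in a predictable direction.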