Address and dereference operators - Time & Space Complexity
We want to see how using address and dereference operators affects how long a program takes to run.
Specifically, we ask: does accessing memory with these operators slow down the program as input grows?
Analyze the time complexity of the following code snippet.
```c
int sumArray(int *arr, int n) {
    int sum = 0;
    for (int i = 0; i < n; i++) {
        sum += *(arr + i); // dereference operator
    }
    return sum;
}
```
This code sums all elements in an integer array using the address and dereference operators.
Identify the repeated work: loops, recursion, or array traversals.
- Primary operation: Accessing each array element by dereferencing a pointer.
- How many times: Exactly once for each element, so n times.
As the array size grows, the program does more work by accessing more elements.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 dereferences and additions |
| 100 | 100 dereferences and additions |
| 1000 | 1000 dereferences and additions |
Pattern observation: The work grows directly with the number of elements; doubling n doubles the work.
Time Complexity: O(n)
This means the time to run grows in a straight line with the size of the input array.
[X] Wrong: "Using the dereference operator makes the code slower by a lot compared to normal array access."
[OK] Correct: Dereferencing a pointer is essentially how array access works under the hood, so it does not add extra time complexity.
Understanding how pointer access scales helps you explain memory operations clearly and shows you grasp how programs handle data step-by-step.
"What if we replaced the for-loop with recursion to sum the array? How would the time complexity change?"
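One possible answer, sketched as a hypothetical `sumArrayRec` (not from the original): the time complexity stays O(n), since the recursion still dereferences each element exactly once, but the call stack now grows to O(n) space instead of the loop's O(1).

```c
// Hypothetical recursive version: one call and one dereference per
// element, so time remains O(n); the call stack adds O(n) space.
int sumArrayRec(int *arr, int n) {
    if (n == 0) return 0;                      // base case: empty array
    return *arr + sumArrayRec(arr + 1, n - 1); // first element + the rest
}
```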