Arithmetic operators - Time & Space Complexity
We want to see how the time to run arithmetic operations changes as we do more of them.
How does the number of calculations affect the total time?
Analyze the time complexity of the following code snippet.
```java
int sum = 0;
for (int i = 0; i < n; i++) {
    sum = sum + i;
}
return sum;
```
This code adds numbers from 0 up to n-1 using a loop.
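To try the snippet on its own, it can be wrapped in a small method. The class and method names below are illustrative, not part of the original snippet:

```java
public class SumDemo {
    // Sums the integers 0 .. n-1, performing one addition per iteration.
    static int sumFirstN(int n) {
        int sum = 0;
        for (int i = 0; i < n; i++) {
            sum = sum + i;
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(sumFirstN(10)); // sum of 0..9 is 45
    }
}
```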
Identify the constructs that repeat work, such as loops, recursion, or array traversals.
- Primary operation: Addition inside the loop
- How many times: Exactly n times, once per loop iteration
As n grows, the number of additions grows the same way.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 additions |
| 100 | 100 additions |
| 1000 | 1000 additions |
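The counts in the table can be checked directly by instrumenting the loop with a counter. This is a sketch, with an illustrative `countAdditions` helper that is not part of the original snippet:

```java
public class OpCount {
    // Runs the same loop and counts how many additions it performs for a given n.
    static long countAdditions(int n) {
        long additions = 0;
        int sum = 0;
        for (int i = 0; i < n; i++) {
            sum = sum + i; // the one addition per iteration
            additions++;
        }
        return additions;
    }

    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 1000}) {
            System.out.println("n = " + n + " -> " + countAdditions(n) + " additions");
        }
    }
}
```

Running this reproduces the table: the count matches n exactly at every size.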
Pattern observation: The work grows directly with n, so doubling n doubles the work.
Time Complexity: O(n)
This means the running time grows linearly with the input size.
[X] Wrong: "Arithmetic operations inside a loop take constant time no matter how many times they run."
[OK] Correct: Each operation takes a small fixed time, but doing many operations adds up, so total time grows with the number of operations.
Understanding how simple arithmetic inside loops adds up helps you explain how a program's running time scales as its input grows.
"What if we replaced the addition with a multiplication inside the loop? How would the time complexity change?"