Arithmetic operators in Go - Time & Space Complexity
We want to see how the time spent on arithmetic operations changes as we perform more of them. How does the number of calculations grow as the input size increases?
Analyze the time complexity of the following code snippet.
```go
func sumArray(nums []int) int {
	total := 0
	for _, num := range nums {
		total += num
	}
	return total
}
```
This code adds up all numbers in a list using arithmetic operators inside a loop.
Identify the constructs that repeat work: loops, recursion, and array traversals.
- Primary operation: Addition inside the loop (total += num)
- How many times: Once for each number in the list
Each number in the list causes one addition operation.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 additions |
| 100 | 100 additions |
| 1000 | 1000 additions |
Pattern observation: The number of operations grows in direct proportion to the number of items.
Time Complexity: O(n)
This means the running time grows linearly: doubling the length of the list roughly doubles the time to finish.
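The table above can be checked empirically. Below is a minimal sketch: `sumArrayCounted` is a hypothetical helper (not part of the original snippet) that mirrors `sumArray` but also counts how many additions the loop performs.

```go
package main

import "fmt"

// sumArrayCounted behaves like sumArray but also counts the
// additions performed by the loop body (illustrative helper).
func sumArrayCounted(nums []int) (total, ops int) {
	for _, num := range nums {
		total += num // one addition per element
		ops++
	}
	return total, ops
}

func main() {
	// One addition per element: the count tracks n exactly.
	for _, n := range []int{10, 100, 1000} {
		_, ops := sumArrayCounted(make([]int, n))
		fmt.Printf("n=%d -> %d additions\n", n, ops)
	}
}
```

Running this reproduces the table row for row, which is exactly what O(n) predicts.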
[X] Wrong: "Arithmetic operations inside a loop are instant and don't affect time."
[OK] Correct: Even simple additions take time, and doing them many times adds up as the list grows.
Understanding how loops with arithmetic operations scale helps you explain efficiency clearly and confidently.
"What if we replaced the loop with nested loops doing additions? How would the time complexity change?"