Iterating over arrays in Go - Time & Space Complexity
When we go through each item in an array one by one, the total time depends on how many items there are. We want to understand how this time grows as the array gets bigger.
The question is: how does the amount of work change when the array size changes?
Analyze the time complexity of the following code snippet.
```go
package main

// sumArray adds up all the numbers in the slice by visiting each element once.
func sumArray(arr []int) int {
	total := 0
	for _, value := range arr {
		total += value
	}
	return total
}
```
This code adds up all numbers in an array by visiting each element once.
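To see the function in action, here is a minimal, self-contained sketch; the sample slice and the `main` wrapper are assumptions added for illustration:

```go
package main

import "fmt"

// sumArray adds up all the numbers in the slice by visiting each element once.
func sumArray(arr []int) int {
	total := 0
	for _, value := range arr {
		total += value
	}
	return total
}

func main() {
	nums := []int{3, 1, 4, 1, 5}
	fmt.Println(sumArray(nums)) // prints 14
}
```

Five elements means five additions; the single pass over the slice is the only work the function does.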
Identify the operations that repeat: loops, recursion, and array traversals.
- Primary operation: Adding each element to total inside the loop.
- How many times: Once for every element in the array.
As the array gets bigger, the number of additions grows at the same rate.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 additions |
| 100 | 100 additions |
| 1000 | 1000 additions |
Pattern observation: The work grows directly with the number of items. Double the items, double the work.
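The table's pattern can be checked empirically. The sketch below instruments the loop with an operation counter; the counter is an assumption added for demonstration, not part of the original code:

```go
package main

import "fmt"

// sumArrayCounted sums arr and also reports how many additions were performed.
func sumArrayCounted(arr []int) (total, ops int) {
	for _, value := range arr {
		total += value
		ops++ // one addition per element
	}
	return total, ops
}

func main() {
	// Mirror the table: 10, 100, and 1000 elements.
	for _, n := range []int{10, 100, 1000} {
		arr := make([]int, n)
		_, ops := sumArrayCounted(arr)
		fmt.Printf("n=%d -> %d additions\n", n, ops)
	}
}
```

The printed counts match the input sizes exactly, which is what O(n) means in practice.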
Time Complexity: O(n)
This means the time to finish grows in a straight line with the number of elements.
[X] Wrong: "Since the loop is simple, it must be constant time."
[OK] Correct: Even a simple loop does more work as the array grows. The time depends on how many elements there are.
Understanding how loops over arrays scale is a key skill. It helps you explain your code clearly and shows you know how programs behave with bigger data.
"What if we nested another loop inside to compare each element with every other? How would the time complexity change?"
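One way to explore that question is a sketch of a pairwise comparison; the `countEqualPairs` helper is hypothetical. With two nested loops, every element is compared against every later element, so the work grows roughly with n squared rather than n:

```go
package main

import "fmt"

// countEqualPairs compares each element with every element after it,
// performing about n*(n-1)/2 comparisons — an O(n^2) pattern.
func countEqualPairs(arr []int) int {
	pairs := 0
	for i := 0; i < len(arr); i++ {
		for j := i + 1; j < len(arr); j++ { // inner loop runs again for each outer iteration
			if arr[i] == arr[j] {
				pairs++
			}
		}
	}
	return pairs
}

func main() {
	fmt.Println(countEqualPairs([]int{1, 2, 1, 3, 2})) // prints 2
}
```

Doubling the number of items roughly quadruples the comparisons, so the time complexity becomes O(n²) instead of O(n).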