Why Operators Drive Computation in R: A Performance Analysis
We want to see how the number of operations a program performs affects how long it takes to run. Specifically, we ask: which parts of the code cause the most work as the input grows?
Let's analyze the time complexity of the following code snippet.
```r
# Sum all elements in a numeric vector
sum_vector <- function(vec) {
  total <- 0
  for (i in seq_along(vec)) {
    total <- total + vec[i]
  }
  return(total)
}
```
This code adds up every number in a vector one by one.
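As a quick sanity check, we can run the function on a small example vector and compare it with R's built-in `sum()`; the definition is repeated here so the snippet runs on its own, and the test vector is just an illustration:

```r
# Definition repeated from above so this snippet is self-contained
sum_vector <- function(vec) {
  total <- 0
  for (i in seq_along(vec)) {
    total <- total + vec[i]
  }
  return(total)
}

vec <- c(2, 4, 6, 8)  # a small example vector

sum_vector(vec)  # 20
sum(vec)         # 20 -- R's built-in vectorized sum agrees
```

In practice you would use the built-in `sum()`, which does the same n additions in compiled C code; the loop version is shown here because it makes each addition explicit and countable.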
Identify the constructs that repeat work: loops, recursion, and array traversals.
- Primary operation: Addition operator inside the loop.
- How many times: Once for each element in the input vector.
Each new number means one more addition to do.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 additions |
| 100 | 100 additions |
| 1000 | 1000 additions |
Pattern observation: The work grows directly with the number of items.
Time Complexity: O(n)
This means the time to finish grows in a straight line as the input gets bigger.
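We can observe this linear pattern empirically by timing the function on inputs that grow by a factor of 10 each step. A minimal sketch, assuming `sum_vector` as defined above; the input sizes are arbitrary and exact timings will vary by machine:

```r
# Definition repeated so this snippet is self-contained
sum_vector <- function(vec) {
  total <- 0
  for (i in seq_along(vec)) {
    total <- total + vec[i]
  }
  return(total)
}

# Time the loop on inputs growing 10x per step;
# elapsed time should grow by roughly 10x as well
for (n in c(1e4, 1e5, 1e6)) {
  vec <- runif(n)
  elapsed <- system.time(sum_vector(vec))["elapsed"]
  cat("n =", format(n, scientific = FALSE), "->", elapsed, "seconds\n")
}
```

For more reliable measurements of short-running code, repeating each timing and averaging reduces noise, since a single `system.time()` call on a small input can be dominated by measurement overhead.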
[X] Wrong: "The addition operator runs only once no matter the input size."
[OK] Correct: The addition happens inside a loop, so it repeats once for every element, and the total work grows with input size.
Understanding which operations repeat helps you explain how your code scales and shows you know what makes programs slower as data grows.
"What if we replaced the addition operator with a more complex operation inside the loop? How would the time complexity change?"