Aggregate functions (Count, Sum, Average) in C# - Time & Space Complexity
When using aggregate functions like Count, Sum, and Average, it is important to understand how the time to compute them grows as the data size increases.
We want to know how the number of operations changes when we have more items to process.
Analyze the time complexity of the following code snippet.
int[] numbers = {1, 2, 3, 4, 5};
int count = numbers.Length;
int sum = 0;
for (int i = 0; i < numbers.Length; i++)
{
    sum += numbers[i];
}
double average = (double)sum / count;
This code counts the number of items, sums all values, and calculates the average.
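The same three aggregates can also be computed with LINQ's built-in Count(), Sum(), and Average() methods. A minimal sketch (note that for arrays, Count() is O(1) because the length is stored, while Sum() and Average() must still visit every element):

```csharp
using System;
using System.Linq;

class LinqAggregates
{
    static void Main()
    {
        int[] numbers = { 1, 2, 3, 4, 5 };

        // Count() is O(1) here: arrays expose their length directly.
        int count = numbers.Count();

        // Sum() and Average() walk every element: O(n).
        int sum = numbers.Sum();
        double average = numbers.Average();

        Console.WriteLine($"{count} {sum} {average}"); // prints "5 15 3"
    }
}
```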
Identify the repeating work: loops, recursion, or array traversals.
- Primary operation: The for-loop that adds each number to the sum.
- How many times: It runs once for each item in the array.
As the number of items grows, the time to sum them grows in a straight line.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 additions |
| 100 | 100 additions |
| 1000 | 1000 additions |
Pattern observation: The operations increase directly with the number of items.
Time Complexity: O(n)
This means the time to compute Sum and Average grows linearly with the number of items. Count is O(1) here because arrays store their length, but counting a sequence without a stored length also takes one full pass.
[X] Wrong: "Calculating Sum or Average is instant no matter how many items there are."
[OK] Correct: Each item must be looked at once to add it, so more items mean more work.
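One way to confirm the linear pattern in the table above is to count the additions directly. A small sketch (the operations counter is illustrative, not part of the original snippet):

```csharp
using System;

class OperationCounter
{
    // Sums the array while tallying how many additions were performed.
    static int SumWithCount(int[] data, out int operations)
    {
        int sum = 0;
        operations = 0;
        foreach (int value in data)
        {
            sum += value;   // one addition per element
            operations++;   // tally the work done
        }
        return sum;
    }

    static void Main()
    {
        foreach (int n in new[] { 10, 100, 1000 })
        {
            int[] data = new int[n];   // the values don't affect the count
            SumWithCount(data, out int ops);
            Console.WriteLine($"n = {n}: {ops} additions");
        }
    }
}
```

The printed counts match the table: 10, 100, and 1000 additions for inputs of those sizes.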
Understanding how aggregate functions scale helps you explain performance clearly and shows you know how data size affects your code.
"What if we used multiple loops to calculate Sum and Count separately? How would the time complexity change?"