# $sum accumulator in MongoDB - Time & Space Complexity
When using the $sum accumulator in MongoDB, it's important to understand how the time to calculate the sum grows as the data size increases.
We want to know: how does the work change when we add more documents to sum?
Analyze the time complexity of the following MongoDB aggregation snippet.
```javascript
db.sales.aggregate([
  { $group: { _id: "$store", totalSales: { $sum: "$amount" } } }
])
```
This groups sales by store and sums the amount for each store.
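To see where the work comes from, here is a minimal in-memory sketch of what this $group / $sum stage effectively does. The sample documents are made up for illustration; in practice they would come from the `sales` collection.

```javascript
// Hypothetical sample documents standing in for the sales collection.
const sales = [
  { store: "A", amount: 10 },
  { store: "B", amount: 5 },
  { store: "A", amount: 7 },
];

// One pass over the documents: look up (or create) the running total
// for the document's store key and add the amount to it.
function groupAndSum(docs) {
  const totals = {};
  for (const doc of docs) {
    totals[doc.store] = (totals[doc.store] ?? 0) + doc.amount;
  }
  return totals;
}

const result = groupAndSum(sales); // { A: 17, B: 5 }
```

Each document is visited exactly once, which is the key fact behind the analysis that follows.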
Look for repeated actions that take time.
- Primary operation: summing the `amount` field for each document in the collection.
- How many times: once for each document in the input data.
As the number of documents grows, the sum operation must add more numbers.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 additions |
| 100 | 100 additions |
| 1000 | 1000 additions |
Pattern observation: The work grows directly with the number of documents; double the documents means double the additions.
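The table's pattern can be checked directly with a small instrumented sketch (hypothetical helper, not part of MongoDB) that counts how many additions a $sum-style pass performs for n documents:

```javascript
// Count the additions performed when summing n documents.
// The generated documents are placeholders; only their count matters.
function countAdditions(n) {
  const docs = Array.from({ length: n }, (_, i) => ({ amount: i }));
  let additions = 0;
  let total = 0;
  for (const doc of docs) {
    total += doc.amount; // one addition per document
    additions += 1;
  }
  return additions;
}

// countAdditions(10) yields 10, countAdditions(1000) yields 1000:
// the work grows in lockstep with the input size.
```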
Time Complexity: O(n)
This means the time to compute the sum grows linearly with the number of documents.
[X] Wrong: "The $sum accumulator runs instantly no matter how many documents there are."
[OK] Correct: Each document's value must be added, so more documents mean more work and more time.
Understanding how aggregation operations like $sum scale helps you explain performance in real projects and demonstrates that you can reason clearly about the impact of data size.
What if we changed the aggregation to group by two fields instead of one? How would the time complexity change?
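One way to reason about it, sketched in plain JavaScript with made-up documents: grouping by two fields only changes the grouping key (and possibly the number of distinct groups), not the number of passes over the data, so each document is still processed once.

```javascript
// Hypothetical sample documents with a second grouping field, region.
const salesByRegion = [
  { store: "A", region: "north", amount: 10 },
  { store: "A", region: "south", amount: 4 },
  { store: "A", region: "north", amount: 6 },
];

// Grouping by a compound key (store + region): still a single pass
// over the documents, so still O(n) in the number of documents.
function groupByTwoFields(docs) {
  const totals = {};
  for (const doc of docs) {
    const key = `${doc.store}|${doc.region}`; // compound grouping key
    totals[key] = (totals[key] ?? 0) + doc.amount;
  }
  return totals;
}

const regionTotals = groupByTwoFields(salesByRegion);
// { "A|north": 16, "A|south": 4 }
```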