$unwind for flattening arrays in MongoDB - Time & Space Complexity
When using $unwind in MongoDB, we want to understand how the amount of work scales as the arrays get larger.
We ask: How does flattening arrays affect the number of operations?
Analyze the time complexity of the following MongoDB aggregation using $unwind.
```javascript
db.orders.aggregate([
  { $match: { status: "active" } },
  { $unwind: "$items" },
  { $group: { _id: "$items.productId", totalQty: { $sum: "$items.qty" } } }
])
```
This pipeline filters active orders, then flattens the items array so each item is a separate document, and finally groups by product ID.
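The logic of these three stages can be sketched in plain JavaScript. This is a simulation of what each stage does, not MongoDB's actual server implementation, and the sample `orders` data is hypothetical:

```javascript
// Hypothetical sample data standing in for the orders collection.
const orders = [
  { status: "active", items: [{ productId: "a", qty: 2 }, { productId: "b", qty: 1 }] },
  { status: "closed", items: [{ productId: "a", qty: 5 }] },
  { status: "active", items: [{ productId: "a", qty: 3 }] },
];

// $match: keep only active orders.
const matched = orders.filter(o => o.status === "active");

// $unwind: emit one document per array element --
// the work here is proportional to the total number of elements.
const unwound = matched.flatMap(o => o.items.map(item => ({ ...o, items: item })));

// $group: sum qty per productId.
const totals = {};
for (const doc of unwound) {
  totals[doc.items.productId] = (totals[doc.items.productId] ?? 0) + doc.items.qty;
}

console.log(unwound.length); // 3 -- one output per element (2 + 1), not per document
console.log(totals);         // { a: 5, b: 1 }
```

Note that the unwind step produces 3 documents from 2 matched orders, because the matched orders contain 3 array elements in total.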
Look for repeated work inside the pipeline.
- Primary operation: `$unwind` repeats work for each element in the `items` array.
- How many times: Once per array element in each matched document.
As the number of array elements grows, the work grows proportionally.
| Array size per document (n) | Approx. $unwind operations |
|---|---|
| 10 items per document | About 10 operations per document |
| 100 items per document | About 100 operations per document |
| 1000 items per document | About 1000 operations per document |
Pattern observation: The number of operations grows linearly with the number of array elements.
Time Complexity: O(n)
This means the time spent unwinding grows in direct proportion to the total number of array elements being flattened.
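The linear pattern in the table can be checked with a small plain-JavaScript count. This simulates the unwind step only; the array contents are placeholder values:

```javascript
// Count how many output documents a simulated $unwind emits
// for a single document whose array has n elements.
function unwindCount(n) {
  const doc = { items: Array.from({ length: n }, (_, i) => i) };
  return doc.items.map(item => ({ ...doc, items: item })).length;
}

console.log([10, 100, 1000].map(unwindCount)); // [ 10, 100, 1000 ]
```

The output count matches the array size exactly, which is the linear O(n) growth described above.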
[X] Wrong: "$unwind runs once per document regardless of array size."
[OK] Correct: $unwind processes each element inside the array separately, so more elements mean more work.
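A quick plain-JavaScript check makes the misconception concrete. The two sample documents below are hypothetical; the point is that two input documents do not mean two unwind outputs:

```javascript
// Two documents, but 2 + 5 = 7 array elements in total.
const docs = [
  { _id: 1, items: Array.from({ length: 2 }, (_, i) => ({ sku: i })) },
  { _id: 2, items: Array.from({ length: 5 }, (_, i) => ({ sku: i })) },
];

// Simulated $unwind: one output document per element, not per input document.
const unwound = docs.flatMap(d => d.items.map(item => ({ ...d, items: item })));

console.log(docs.length);    // 2 input documents
console.log(unwound.length); // 7 output documents -- work scales with elements, not documents
```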
Understanding how $unwind scales helps you explain data processing costs clearly and shows you know how MongoDB handles arrays internally.
"What if we replaced $unwind with $project to keep arrays as is? How would the time complexity change?"