Coin Change (Total Number of Ways) in C - Time & Space Complexity
We want to know how the time needed to find all ways to make change grows as we add more coins or increase the amount.
How does the work increase when the input size changes?
Analyze the time complexity of the following code snippet.
int countWays(int coins[], int n, int amount) {
    int dp[amount + 1];                        // dp[j] = number of ways to make amount j
    for (int i = 0; i <= amount; i++) dp[i] = 0;
    dp[0] = 1;                                 // one way to make 0: use no coins
    for (int i = 0; i < n; i++) {              // for each coin denomination
        for (int j = coins[i]; j <= amount; j++) {
            dp[j] += dp[j - coins[i]];         // add ways that use coin i at least once
        }
    }
    return dp[amount];
}
This code counts how many ways the target amount can be made from the given coin denominations, where each coin may be used any number of times.
Identify the operations that repeat: loops, recursion, and array traversals.
- Primary operation: Nested loops where the outer loop runs over each coin and the inner loop runs over amounts from coin value to target amount.
- How many times: Outer loop runs n times (number of coins), inner loop runs up to amount times.
As we add more coins or increase the amount, the work grows by multiplying these two factors.
| Input Size (coins n, amount) | Approx. Operations |
|---|---|
| n=5, amount=10 | ~50 operations |
| n=10, amount=100 | ~1,000 operations |
| n=20, amount=1000 | ~20,000 operations |
Pattern observation: The total work grows roughly by multiplying the number of coins and the amount.
Time Complexity: O(n * amount)
Space Complexity: O(amount), for the one-dimensional dp array.
This means the time needed grows proportionally with both the number of coins and the target amount, while the extra memory grows only with the amount.
[X] Wrong: "The time depends only on the amount, not the number of coins."
[OK] Correct: Each coin adds a full loop over the amounts, so more coins mean more work.
Understanding this time complexity helps you explain how dynamic programming solves problems efficiently by breaking them into smaller parts.
"What if we used recursion with memoization instead of this bottom-up approach? How would the time complexity change?"