Coin Change Minimum Coins in DSA C - Time & Space Complexity
We want to know how the time needed to find the minimum coins changes as the amount grows.
How do the number of coin types and the amount affect the work done?
Analyze the time complexity of the following code snippet.
#include <limits.h>

// Returns the fewest coins needed to make 'amount',
// or INT_MAX if the amount cannot be formed.
int minCoins(int coins[], int n, int amount) {
    int dp[amount + 1]; // dp[i] = fewest coins needed to make amount i
    dp[0] = 0;
    for (int i = 1; i <= amount; i++) {
        dp[i] = INT_MAX;
        for (int j = 0; j < n; j++) {
            if (coins[j] <= i && dp[i - coins[j]] != INT_MAX) {
                dp[i] = dp[i] < dp[i - coins[j]] + 1 ? dp[i] : dp[i - coins[j]] + 1;
            }
        }
    }
    return dp[amount];
}
This code builds a table where dp[i] holds the fewest coins needed to make amount i from the given denominations; dp[amount] is the answer (INT_MAX if the amount cannot be formed).
Look at the loops that repeat work:
- Primary operation: The inner loop checks all coin types for each amount value.
- How many times: The outer loop runs once for each amount from 1 to amount (amount times), and the inner loop runs for each coin type (n times).
As the amount grows, the work grows by checking all coins for each amount value.
| Input Size (amount) | Approx. Operations |
|---|---|
| 10 | 10 * number_of_coins |
| 100 | 100 * number_of_coins |
| 1000 | 1000 * number_of_coins |
Pattern observation: The operations grow roughly in a straight line with the amount, multiplied by the number of coin types.
Time Complexity: O(amount * n)
This means the time grows proportionally to the amount times the number of coin types.
Space Complexity: O(amount)
The dp array stores one entry for every amount from 0 to amount.
[X] Wrong: "The time depends only on the amount, not the number of coins."
[OK] Correct: Each coin type is checked for every amount, so more coin types mean more work.
Understanding this helps you explain how dynamic programming solves problems efficiently by breaking them into smaller parts.
"What if we used recursion without memoization? How would the time complexity change?"