Why Greedy Works and When It Fails in C - Performance Analysis
We want to understand how fast greedy algorithms run and why they sometimes fail to give the best answer.
We ask two questions: how does the work grow as the input gets bigger, and when does greedy fail?
Analyze the time complexity of this simple greedy approach for coin change.
```c
int greedyCoinChange(int coins[], int n, int amount) {
    int count = 0;
    /* Walk from the largest coin down; assumes coins[] is sorted ascending. */
    for (int i = n - 1; i >= 0; i--) {
        while (amount >= coins[i]) {   /* take this coin as long as it fits */
            amount -= coins[i];
            count++;
        }
    }
    return count;
}
```
This code repeatedly takes the largest coin that still fits, then moves to smaller coins until the amount reaches zero or no coins are left (note that it assumes `coins[]` is sorted in ascending order).
Look for loops and repeated steps.
- Primary operation: The inner while loop subtracting coins from amount.
- How many times it runs: at most `amount / smallest_coin` subtractions in total, since each subtraction removes at least the smallest coin value from the amount.
As amount grows, the number of subtractions grows roughly in proportion.
| Input Size (amount) | Approx. Operations |
|---|---|
| 10 | About 10 subtractions |
| 100 | About 100 subtractions |
| 1000 | About 1000 subtractions |
Pattern observation: The work grows roughly linearly with the amount.
Time Complexity: O(n + amount / smallest_coin)
The outer loop visits each of the n coin types once, and the total number of subtractions across all inner loops is bounded by how many times the smallest coin fits into the amount.
[X] Wrong: "Greedy always finds the best answer quickly."
[OK] Correct: Greedy makes locally best choices that do not always lead to the globally best solution, especially with tricky coin sets.
Understanding when greedy works and its time cost shows you can choose the right tool for the problem, a key skill in interviews.
"What if we sorted coins in descending order and tried greedy? How would the time complexity and correctness change?"