Why Dynamic Programming, and When Recursion Alone Fails (DSA in C) - Performance Analysis
We want to understand why simple recursion can be slow for some problems and how dynamic programming helps speed things up.
What happens to the work done when recursion repeats the same steps many times?
Analyze the time complexity of the following recursive Fibonacci function.
```c
int fib(int n) {
    if (n <= 1) return n;               /* base cases: fib(0) = 0, fib(1) = 1 */
    return fib(n - 1) + fib(n - 2);     /* two recursive calls per invocation */
}
```
This code calculates the nth Fibonacci number using simple recursion without saving results.
Look at what repeats in this code.
- Primary operation: each call with n greater than 1 makes two further recursive calls, fib(n - 1) and fib(n - 2).
- How many times: the same values are recomputed over and over, and the smaller the argument, the more often it repeats — fib(1) alone is evaluated F(n) times, where F(n) is the nth Fibonacci number.
As n grows, the number of calls grows very fast because each call spawns two more. The total call count C(n) satisfies C(n) = C(n - 1) + C(n - 2) + 1, so it grows like the Fibonacci sequence itself.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | ~177 calls |
| 20 | ~21,891 calls |
| 30 | ~2,692,537 calls |
Pattern observation: each increase of n multiplies the number of calls by roughly 1.6 (the golden ratio), so the work grows exponentially; rounding that factor up to 2 gives the familiar bound below.
Time Complexity: O(2^n)
This means the time needed grows exponentially, making recursion alone very slow for larger inputs.
[X] Wrong: "Recursion always solves problems efficiently because it breaks problems into smaller parts."
[OK] Correct: Without saving results, recursion can repeat the same work many times, causing slow performance.
Understanding when recursion alone is slow and how dynamic programming improves it is a key skill that shows you can write efficient code for complex problems.
"What if we add a memory table to save results of fib calls? How would the time complexity change?"