Rod Cutting Problem in DSA C - Time & Space Complexity
We want to understand how the time needed to solve the rod cutting problem changes as the rod length grows.
How does the number of steps grow when the rod gets longer?
Analyze the time complexity of the following code snippet.
```c
int rodCutting(int price[], int n) {
    if (n == 0) return 0;   // a rod of length 0 earns nothing
    int max_val = -1;       // safe lower bound when prices are non-negative
    // Try every possible first cut of length i (1..n), then solve the
    // remaining rod of length n - i recursively.
    for (int i = 1; i <= n; i++) {
        int val = price[i - 1] + rodCutting(price, n - i);
        if (val > max_val) max_val = val;
    }
    return max_val;
}
```
This code tries all ways to cut the rod and picks the best price sum.
Identify the loops, recursive calls, and array accesses that repeat work.
- Primary operation: The for-loop tries every first-cut length from 1 to n.
- How many times: A call with length n makes n recursive calls, for remaining lengths n - 1 down to 0, and each of those calls branches again into all smaller lengths.
Counting every invocation, the total satisfies C(n) = 1 + C(n-1) + C(n-2) + ... + C(0) with C(0) = 1, which works out to exactly 2^n calls, with many of them recomputing the same smaller rod lengths.
| Input Size (n) | Approx. Number of Calls |
|---|---|
| 10 | 2^10 ≈ one thousand |
| 20 | 2^20 ≈ one million |
| 30 | 2^30 ≈ one billion |
Pattern observation: The number of calls doubles every time the rod gets one unit longer.
Time Complexity: O(2^n)
This means the running time roughly doubles each time the rod length increases by one, which quickly becomes impractical for large rods.
Space Complexity: O(n) — at any moment the recursion is at most n calls deep, so the call stack uses linear extra space.
[X] Wrong: "The code runs in linear time because it just loops from 1 to n."
[OK] Correct: Each loop iteration calls the function again, causing many repeated calls that multiply quickly.
Understanding this helps you explain why naive recursion can be slow and why dynamic programming is better.
"What if we store results of smaller rods to avoid repeated calls? How would the time complexity change?"