# 0/1 Knapsack Problem in DSA using C - Time & Space Complexity
We want to understand how the time needed to solve the 0/1 Knapsack Problem grows as the number of items and the knapsack capacity increase.
How much more work does the algorithm do when we add more items or enlarge the capacity?
Let's analyze the time complexity of the following dynamic programming solution.
```c
// Return the larger of two values
int max(int a, int b) {
    return (a > b) ? a : b;
}

// W: knapsack capacity, wt[]: item weights, val[]: item values, n: number of items
int knapsack(int W, int wt[], int val[], int n) {
    int dp[n+1][W+1];   // dp[i][w] = best value using the first i items with capacity w

    for (int i = 0; i <= n; i++) {
        for (int w = 0; w <= W; w++) {
            if (i == 0 || w == 0)
                dp[i][w] = 0;              // base case: no items or no capacity
            else if (wt[i-1] <= w)
                // Item i fits: take the better of including or excluding it
                dp[i][w] = max(val[i-1] + dp[i-1][w - wt[i-1]], dp[i-1][w]);
            else
                dp[i][w] = dp[i-1][w];     // item i is too heavy: exclude it
        }
    }
    return dp[n][W];
}
```
This code fills a table to find the best value achievable with given items and capacity.
Look at the loops that repeat work:
- Primary operation: The nested loops over items (n) and capacity (W).
- How many times: The outer loop runs n+1 times, and for each of those iterations the inner loop runs W+1 times, so the loop body executes about (n+1)(W+1) times.
As we increase the number of items or the capacity, the work grows as the product of the two.
| Input Size (n, W) | Approx. Operations |
|---|---|
| 10 items, capacity 10 | ~100 operations |
| 100 items, capacity 100 | ~10,000 operations |
| 1000 items, capacity 1000 | ~1,000,000 operations |
Pattern observation: Multiplying both the number of items and the capacity by 10 multiplies the work by about 100; doubling both roughly quadruples it.
Time Complexity: O(n * W)
This means the time grows proportionally to the number of items times the capacity.
Space Complexity: O(n * W) for the dp table; since each row depends only on the previous one, this can be reduced to O(W) with a one-dimensional table.
[X] Wrong: "The time depends only on the number of items, not the capacity."
[OK] Correct: The capacity size directly affects how many subproblems we solve, so it impacts the total work.
Understanding this complexity helps you explain how dynamic programming solves problems efficiently by breaking them into smaller parts.
"What if we used a recursive solution without memoization? How would the time complexity change?"