Unique Paths in a Grid (Grid DP in C) - Time & Space Complexity
We want to know how the time needed to find unique paths in a grid changes as the grid size grows.
How does the number of steps grow when the grid gets bigger?
Analyze the time complexity of the following code snippet.
```c
int uniquePaths(int m, int n) {
    int dp[m][n];                               // C99 variable-length array
    for (int i = 0; i < m; i++) dp[i][0] = 1;   // first column: one path (straight down)
    for (int j = 0; j < n; j++) dp[0][j] = 1;   // first row: one path (straight right)
    for (int i = 1; i < m; i++) {
        for (int j = 1; j < n; j++) {
            dp[i][j] = dp[i-1][j] + dp[i][j-1]; // paths from above + paths from the left
        }
    }
    return dp[m-1][n-1];
}
```
This code calculates the number of unique paths from the top-left to the bottom-right of an m by n grid using dynamic programming.
Identify the repeated operations: loops, recursion, or array traversals.
- Primary operation: The nested loops that fill the dp table.
- How many times: The inner loop body executes (m-1) * (n-1) times in total, covering every cell outside the first row and column; the two initialization loops add only m + n more steps, which the nested loops dominate.
As the grid size grows, the number of operations grows roughly with the number of cells in the grid.
| Input Size (m x n) | Approx. Operations |
|---|---|
| 10 x 10 | ~81 (9*9) |
| 100 x 100 | ~9,801 (99*99) |
| 1000 x 1000 | ~998,001 (999*999) |
Pattern observation: The operations grow roughly by the product of m and n, so doubling both dimensions roughly quadruples the work.
Time Complexity: O(m * n)
This means the time needed grows proportionally to the total number of cells in the grid.
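The space cost deserves a note too: the `dp` table holds m * n integers, so space is also O(m * n). Since each cell depends only on the cell above and the cell to the left, a common refinement (a sketch of one possible variant, not the snippet above) keeps a single row of n counts, cutting space to O(n) while leaving time at O(m * n):

```c
#include <stdlib.h>

/* Space-optimized sketch: row[j] plays the role of dp[i][j]; before the
   in-place update it still holds dp[i-1][j] (the cell above), and row[j-1]
   already holds dp[i][j-1] (the cell to the left). */
int uniquePathsRow(int m, int n) {
    int *row = malloc(n * sizeof *row);
    if (!row) return -1;                      /* allocation failure guard */
    for (int j = 0; j < n; j++) row[j] = 1;   /* first row: one path each */
    for (int i = 1; i < m; i++)
        for (int j = 1; j < n; j++)
            row[j] += row[j - 1];             /* above + left */
    int result = row[n - 1];
    free(row);
    return result;
}
```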
[X] Wrong: "The time grows exponentially because there are many paths."
[OK] Correct: The dynamic programming approach stores each subresult, so every cell is computed exactly once; the time therefore grows linearly with the number of cells, m * n, even though the number of paths itself grows much faster.
Understanding this time complexity helps you explain how dynamic programming improves efficiency by avoiding repeated calculations, a key skill in many coding challenges.
"What if we used recursion without memoization? How would the time complexity change?"
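To see the contrast, here is a naive recursive sketch (an illustration, not part of the original snippet). With no memoization, each call re-solves the same subgrids over and over, and the number of calls grows roughly exponentially in m + n, versus O(m * n) for the table-filling version:

```c
/* Naive recursion with no memoization: each call branches into two
   overlapping subproblems, so subgrids are recomputed many times. */
long long uniquePathsNaive(int m, int n) {
    if (m == 1 || n == 1) return 1;   /* single row or column: one path */
    return uniquePathsNaive(m - 1, n) + uniquePathsNaive(m, n - 1);
}
```

Both versions return the same answer (e.g. 28 for a 3 x 7 grid); only the work needed to reach it differs.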