Recursion Concept and Call Stack Visualization in C (DSA) - Time & Space Complexity
We want to understand how the time taken by a recursive function grows as the input size increases.
Specifically, how many times the function calls itself and what that means for performance.
Analyze the time complexity of the following code snippet.
```c
int factorial(int n) {
    if (n == 0) {
        return 1;
    }
    return n * factorial(n - 1);
}
```
This code calculates the factorial of a number using recursion by calling itself with a smaller number until it reaches zero.
Identify the operations that repeat: loops, recursion, or array traversals.
- Primary operation: The function calls itself once per call, reducing n by 1 each time.
- How many times: The function is called n + 1 times in total (once each for n, n - 1, ..., 1, 0) before the base case (n == 0) stops the recursion.
Each increase in n adds one more function call, so the total calls grow directly with n.
| Input Size (n) | Approx. Operations (Function Calls) |
|---|---|
| 10 | 11 calls |
| 100 | 101 calls |
| 1000 | 1001 calls |
Pattern observation: The number of calls grows linearly as n increases.
Time Complexity: O(n)
This means the time taken grows in a straight line with the input size; doubling n roughly doubles the work.
[X] Wrong: "Recursion always makes the program slower exponentially because it calls itself many times."
[OK] Correct: Here each call spawns only one further call, so the total work grows linearly with n, not exponentially.
Understanding how recursion grows helps you explain your code clearly and reason about performance in interviews.
"What if the recursive function called itself twice each time? How would the time complexity change?"