Call stack behavior in C++ - Time & Space Complexity
When a program runs functions, it uses a call stack to keep track of where it is. Understanding how this stack grows helps us see how time grows with input.
We want to know how the number of function calls affects the total work done.
Analyze the time complexity of the following recursive function.
```cpp
int factorial(int n) {
    if (n <= 1) return 1;          // base case: stop recursing at 1
    return n * factorial(n - 1);   // one recursive call per level
}
```
This function calculates the factorial of a number by calling itself with smaller values until it reaches 1.
Look at what repeats as the function calls itself:
- Primary operation: the recursive call to factorial(n - 1).
- How many times: once for each value from n down to 1, so n times in total.
As n grows, the number of calls grows the same way.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 calls |
| 100 | About 100 calls |
| 1000 | About 1000 calls |
Pattern observation: The work grows directly with n. Double n, double the calls.
Time Complexity: O(n)
This means the time to finish grows in a straight line with the input size.
[X] Wrong: "Recursive calls happen all at once, so time is constant."
[OK] Correct: Each call waits for the next one to finish, so calls add up one after another, making time grow with n.
Understanding how the call stack grows helps you explain recursion clearly and shows you can think about how programs use memory and time together.
What if the function called itself twice each time (like in Fibonacci)? How would the time complexity change?