Activation records and call stack in Compiler Design - Time & Space Complexity
When functions call other functions, the computer uses activation records to keep track of each call.
We want to understand how the number of these records grows as more function calls happen.
Analyze the time complexity of the following recursive function using activation records.
```javascript
function factorial(n) {
  if (n == 0) return 1;
  else return n * factorial(n - 1);
}
```
This function calls itself repeatedly, creating an activation record for each call until it reaches the base case.
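One way to see this concretely is to count the calls directly. The sketch below (not part of the original text) adds a counter variable, here named `calls` as an illustrative choice, that increments once per activation record:

```javascript
// Sketch: count activation records by incrementing a counter on each call.
let calls = 0;

function factorial(n) {
  calls++; // one activation record is created per call
  if (n == 0) return 1;
  else return n * factorial(n - 1);
}

factorial(10);
console.log(calls); // 11 calls for n = 10, matching the n + 1 pattern
```

Running this for different inputs reproduces the table of operation counts below.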
To count operations, look at how often the recursive call repeats.
- Primary operation: Each call to `factorial` creates a new activation record.
- How many times: Exactly `n + 1` times for input `n` (from `n` down to 0).
Each increase in n adds one more function call and activation record.
| Input Size (n) | Approx. Operations (Activation Records) |
|---|---|
| 10 | 11 |
| 100 | 101 |
| 1000 | 1001 |
Pattern observation: The number of activation records grows directly with n.
Time Complexity: O(n)
This means the number of function calls and activation records grows linearly as the input number increases.
[X] Wrong: "The recursive calls happen all at once, so the time is constant regardless of input size."
[OK] Correct: Each call waits for the next one to finish, so calls stack up one after another, increasing with input size.
Understanding how activation records grow helps you explain recursion depth and stack usage clearly, a key skill in many programming discussions.
What if the function called itself twice in each call (like in Fibonacci)? How would the time complexity and activation records grow?
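As a hedged sketch of that scenario (an assumption-based illustration, not from the text above): in naive Fibonacci each call spawns two more, so the total number of activation records created grows exponentially, roughly O(2^n) time, even though at most about `n` records are live on the stack at any one moment. The counter variable `fibCalls` below is our own illustrative name:

```javascript
// Sketch: count total activation records for naive double recursion.
let fibCalls = 0;

function fib(n) {
  fibCalls++; // one activation record per call
  if (n <= 1) return n;
  return fib(n - 1) + fib(n - 2); // two recursive calls per call
}

fib(10);
console.log(fibCalls); // 177 calls for n = 10, versus 11 for factorial(10)
```

The contrast with the linear `n + 1` count for `factorial` shows why branching recursion is so much more expensive in time, while stack depth (space) still grows only linearly with `n`.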