Recursive functions in R Programming - Time & Space Complexity
When a recursive function runs, the program calls itself repeatedly, and we want to know how the running time grows as the input gets bigger.
The key question: how many times does the function call itself, and how does that affect the total work done?
Analyze the time complexity of the following code snippet.
```r
factorial <- function(n) {
  if (n <= 1) {
    return(1)                     # base case: recursion stops here
  } else {
    return(n * factorial(n - 1))  # recursive case: one call per step
  }
}
```
This function calculates the factorial of a number by calling itself with smaller numbers until it reaches 1.
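For example, a call with 5 walks down the chain factorial(5) → factorial(4) → … → factorial(1). The trace below re-states the same function in condensed form so the expansion is easy to follow:

```r
# Same factorial as above, condensed to one line for the trace
factorial <- function(n) if (n <= 1) 1 else n * factorial(n - 1)

factorial(5)
# factorial(5) = 5 * factorial(4)
#             = 5 * 4 * factorial(3)
#             = 5 * 4 * 3 * factorial(2)
#             = 5 * 4 * 3 * 2 * factorial(1)
#             = 5 * 4 * 3 * 2 * 1 = 120
```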
Identify the operations that repeat: loops, recursive calls, or array traversals.
- Primary operation: The function calls itself once per step, reducing n by 1 each time.
- How many times: It repeats this call n times until it reaches the base case.
Each time n increases by 1, the function makes exactly one more call, so the work grows linearly with n.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 calls |
| 100 | About 100 calls |
| 1000 | About 1000 calls |
Pattern observation: The number of calls grows directly with n, so doubling n doubles the work.
Time Complexity: O(n)
This means the time taken grows in a straight line with the size of the input number.
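One way to check the table above empirically (a small sketch, not part of the original lesson; `factorial_counted` and `call_count` are names introduced here) is to increment a counter on every call:

```r
call_count <- 0

factorial_counted <- function(n) {
  call_count <<- call_count + 1            # record every call
  if (n <= 1) 1 else n * factorial_counted(n - 1)
}

for (n in c(10, 100, 1000)) {
  call_count <- 0
  invisible(factorial_counted(n))          # 1000! overflows to Inf, but calls are still counted
  cat("n =", n, "->", call_count, "calls\n")
}
```

The printed counts match the table: n calls for input n, confirming linear growth.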
[X] Wrong: "Recursive functions always take more time than loops because they call themselves multiple times."
[OK] Correct: A recursive function that calls itself once per step does the same amount of work as the equivalent loop, so its time still grows linearly, not faster.
Understanding how recursion affects time helps you explain your code clearly and choose the best approach. It shows you can think about how programs grow with input size.
"What if the recursive function called itself twice each time, like in a Fibonacci calculation? How would the time complexity change?"
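As a hint at the answer, here is a sketch using the same counting trick (with a hypothetical `fib` function): each call now spawns two more, so the number of calls roughly doubles as n grows by one, giving exponential O(2^n) growth instead of O(n).

```r
fib_calls <- 0

fib <- function(n) {
  fib_calls <<- fib_calls + 1                  # count this call
  if (n <= 1) n else fib(n - 1) + fib(n - 2)   # two recursive calls per step
}

for (n in c(10, 15, 20)) {
  fib_calls <- 0
  invisible(fib(n))
  cat("n =", n, "->", fib_calls, "calls\n")
}
# The call counts explode as n grows: roughly exponential, not linear
```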