Fast Exponentiation (Power in O(log n)) in C - Time & Space Complexity
We want to know how fast the fast exponentiation method calculates powers compared to simple repeated multiplication.
How does the number of steps grow when the power gets bigger?
Analyze the time complexity of the following code snippet.
int fastPower(int base, int exp) {
    if (exp == 0) return 1;                  // base case: anything to the power 0 is 1
    int half = fastPower(base, exp / 2);     // solve the half-size problem once
    if (exp % 2 == 0) return half * half;    // even exponent: square the half result
    else return base * half * half;          // odd exponent: one extra factor of base
}
This code calculates base raised to the power exp using a method that splits the problem in half each time.
Identify the repeated work: loops, recursion, or array traversals.
- Primary operation: Recursive calls that halve the exponent each time.
- How many times: About log base 2 of exp times, because the exponent is divided by 2 each call.
Each time we call the function, the exponent gets cut in half, so the number of calls grows slowly as the exponent grows.
| Input Size (exp) | Approx. Operations (recursive calls) |
|---|---|
| 10 | About 4 |
| 100 | About 7 |
| 1000 | About 10 |
Pattern observation: Doubling the exponent adds only one more step, so growth is very slow.
Time Complexity: O(log n)
Space Complexity: O(log n), because the recursion stack holds one frame per halving step.
This means the number of steps grows slowly: only a little more work is needed for much bigger powers.
[X] Wrong: "The function does one multiplication per power, so it must be O(n)."
[OK] Correct: The function splits the problem in half each time, so it does far fewer steps than multiplying n times.
Understanding this method shows you can improve simple tasks by clever thinking, a skill valued in real coding challenges.
"What if we changed the code to use a loop instead of recursion? How would the time complexity change?"
