Fast Exponentiation (Power in O(log n)) in Python - Time & Space Complexity
We want to understand how the work of computing a power grows as the exponent gets bigger.
How does the number of steps change when we calculate a number raised to a large power?
Analyze the time complexity of the following code snippet.
```python
def fast_power(base: int, exponent: int) -> int:
    # Base case: anything raised to the power 0 is 1.
    if exponent == 0:
        return 1
    # Recurse on half the exponent, then combine the halves.
    half = fast_power(base, exponent // 2)
    if exponent % 2 == 0:
        return half * half
    else:
        return half * half * base
```
This code calculates base raised to the exponent using a method that splits the problem in half each time.
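As a quick sanity check, here is a small sketch exercising the function on a few inputs (the function is redefined so the snippet runs on its own):

```python
def fast_power(base: int, exponent: int) -> int:
    # Base case: anything raised to the power 0 is 1.
    if exponent == 0:
        return 1
    # Recurse on half the exponent, then combine the halves.
    half = fast_power(base, exponent // 2)
    if exponent % 2 == 0:
        return half * half
    return half * half * base

print(fast_power(2, 10))  # 1024
print(fast_power(3, 5))   # 243
print(fast_power(7, 0))   # 1
```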
- Primary operation: Recursive calls that halve the exponent each time.
- How many times: About log base 2 of the exponent times.
Each step cuts the exponent roughly in half, so the number of steps grows slowly as the exponent grows.
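To make the halving visible, here is an instrumented sketch (the name `traced_power` and the `depth` parameter are illustrative additions) that prints the exponent at each recursive call:

```python
def traced_power(base: int, exponent: int, depth: int = 0) -> int:
    # Print the current exponent, indented by recursion depth.
    print("  " * depth + f"exponent = {exponent}")
    if exponent == 0:
        return 1
    half = traced_power(base, exponent // 2, depth + 1)
    if exponent % 2 == 0:
        return half * half
    return half * half * base

traced_power(2, 100)
# exponent = 100
#   exponent = 50
#     exponent = 25
#       exponent = 12
#         exponent = 6
#           exponent = 3
#             exponent = 1
#               exponent = 0
```

Starting from 100, only seven halving steps reach 0, which matches log base 2 of 100 (about 6.6).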
| Input Size (exponent) | Approx. Operations (recursive calls) |
|---|---|
| 10 | About 4 |
| 100 | About 7 |
| 1000 | About 10 |
Pattern observation: Doubling the exponent only adds one more step.
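The table's counts can be reproduced with a short sketch (the helper `count_calls` is illustrative, not part of the original code) that counts one step per halving:

```python
def count_calls(exponent: int) -> int:
    # One recursive call per halving until the exponent reaches 0.
    calls = 0
    while exponent > 0:
        exponent //= 2
        calls += 1
    return calls

for n in (10, 100, 1000, 2000):
    print(n, count_calls(n))
# 10 4
# 100 7
# 1000 10
# 2000 11
```

Note that going from 1000 to 2000 adds exactly one call, confirming the pattern.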
Time Complexity: O(log n)
Space Complexity: O(log n), since the recursion stack holds one frame per halving step.
This means the number of steps grows slowly, only increasing by one each time the exponent doubles.
[X] Wrong: "Calculating power always takes as many steps as the exponent value."
[OK] Correct: This method reduces the problem size by half each time, so it takes far fewer steps than the exponent itself.
Understanding this method shows how breaking a problem in half can dramatically speed up a simple calculation, a skill valued in many coding challenges.
"What if we changed the method to multiply the base exponent times without halving? How would the time complexity change?"