GCD and LCM via the Euclidean Algorithm in Python - Time & Space Complexity
We want to understand how fast the Euclidean algorithm finds the greatest common divisor (GCD) of two numbers.
How does the number of steps change as the input numbers get bigger?
Analyze the time complexity of the following code snippet.
```python
def gcd(a: int, b: int) -> int:
    while b != 0:
        a, b = b, a % b
    return a

# LCM can be found using GCD:
def lcm(a: int, b: int) -> int:
    return a * b // gcd(a, b)
```
This code finds the GCD of two numbers using the Euclidean algorithm and then calculates the LCM using the GCD.
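As a quick sanity check, the snippet above can be traced on small inputs and compared against Python's built-in `math.gcd` (the sample values here are illustrative, not from the original):

```python
import math

def gcd(a: int, b: int) -> int:
    while b != 0:
        a, b = b, a % b
    return a

def lcm(a: int, b: int) -> int:
    return a * b // gcd(a, b)

# gcd(48, 18): (48, 18) -> (18, 12) -> (12, 6) -> (6, 0) -> 6
print(gcd(48, 18))  # -> 6
print(lcm(4, 6))    # -> 12

# The hand-rolled version agrees with the standard library:
assert gcd(48, 18) == math.gcd(48, 18)
```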
Identify the operations that repeat: loops, recursion, or array traversals.
- Primary operation: The while loop repeatedly replaces the pair (a, b) with (b, a % b).
- How many times: The loop runs until b becomes zero, which depends on how quickly the remainder shrinks.
Each remainder `a % b` is strictly smaller than `b`, and within any two steps the smaller number at least halves. So the number of steps grows with the number of digits, and only slowly as the numbers get bigger.
| Input Size (digits) | Approx. Steps |
|---|---|
| 2 (e.g., 99) | 5-7 |
| 4 (e.g., 9999) | 15-20 |
| 6 (e.g., 999999) | 30-40 |
Pattern observation: The steps grow roughly in proportion to the number of digits, not to the magnitude of the numbers themselves.
Time Complexity: O(log(min(a, b)))
This means the number of steps grows roughly with the number of digits in the smaller input number.
Space Complexity: O(1) for this iterative version, since the loop only ever stores the two integers a and b.
[X] Wrong: "The Euclidean algorithm takes time proportional to the size of the numbers themselves (like a or b)."
[OK] Correct: The algorithm works on remainders, which shrink quickly, so the steps depend on the number of digits (logarithm), not the full number size.
Understanding this algorithm's time complexity shows you can analyze efficient number-based algorithms, a useful skill for many coding problems.
"What if we used subtraction instead of modulo in the GCD calculation? How would the time complexity change?"
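As a hedged sketch of the variant that question describes (the function name is assumed here, and the inputs must be positive): subtraction removes only `b` per step instead of jumping straight to the remainder, so the step count becomes proportional to the numbers themselves, O(max(a, b)) in the worst case, e.g. gcd(1_000_001, 1) would take a million subtractions.

```python
def gcd_subtraction(a: int, b: int) -> int:
    """Subtraction-only Euclidean algorithm (assumes a, b > 0).

    Repeatedly subtracts the smaller number from the larger;
    worst-case time is O(max(a, b)), not O(log(min(a, b))).
    """
    while a != b:
        if a > b:
            a -= b
        else:
            b -= a
    return a

# (48, 18) -> (30, 18) -> (12, 18) -> (12, 6) -> (6, 6) -> 6
print(gcd_subtraction(48, 18))  # -> 6
```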