Method invocation flow in Python - Time & Space Complexity
When we call methods in a program, each call takes some time to run. Understanding how the total time grows when methods call other methods helps us write faster code.
We want to know: how does the time to finish change as the program runs more method calls?
Analyze the time complexity of the following code snippet.
```python
class Calculator:
    def multiply(self, x, y):
        return x * y

    def square(self, n):
        return self.multiply(n, n)

result = Calculator().square(5)
print(result)
```
This code defines two methods: one multiplies two numbers, and the other calls multiply to square a number.
Identify the repeated work: any loops, recursion, or array traversals.
- Primary operation: the call to `multiply` inside `square`.
- How many times: exactly once per call to `square`.
Each time we call `square`, it calls `multiply` exactly once. The amount of work depends only on the number of calls we make, not on how large the numbers are.
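We can check this claim directly. The sketch below adds a hypothetical `call_count` attribute (not part of the original class) to count how many times `multiply` runs per `square` call:

```python
class Calculator:
    def __init__(self):
        self.call_count = 0  # hypothetical counter, added for instrumentation

    def multiply(self, x, y):
        self.call_count += 1
        return x * y

    def square(self, n):
        return self.multiply(n, n)

calc = Calculator()
for value in (5, 500, 50_000):
    calc.call_count = 0
    calc.square(value)
    print(calc.call_count)  # 1 every time: one multiply per square, regardless of value
```

No matter how large the argument gets, each `square` call triggers exactly one `multiply`.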
| Calls to `square` (n) | Approx. Operations |
|---|---|
| 10 | 10 method calls, 10 multiplications |
| 100 | 100 method calls, 100 multiplications |
| 1000 | 1000 method calls, 1000 multiplications |
Pattern observation: The total work grows in direct proportion to the number of method calls.
Time Complexity: O(1) per call
A single call to `square` does a fixed amount of work, no matter how large its argument is. Making n separate calls therefore costs O(n) in total: the growth comes from how many times you call the method, not from what happens inside it.
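The per-call vs. total distinction can be sketched as follows; `square_all` is a hypothetical helper, not part of the original snippet:

```python
class Calculator:
    def multiply(self, x, y):
        return x * y

    def square(self, n):
        return self.multiply(n, n)

def square_all(calc, values):
    # One square call (hence one multiply) per element:
    # each call is O(1), so n values cost O(n) in total.
    return [calc.square(v) for v in values]

calc = Calculator()
print(square_all(calc, [1, 2, 3]))  # [1, 4, 9]
```

The constant-time method stays constant; only the caller's loop makes the total linear.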
[X] Wrong: "Calling one method inside another makes the program run much slower, like multiplying the time by n squared."
[OK] Correct: Each method call adds a fixed amount of work, so the total time grows linearly with the number of calls; it only becomes quadratic (or worse) when calls are nested inside loops or recursion.
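To see when nesting actually does produce quadratic growth, here is a sketch with a hypothetical `squared_table` helper that puts `multiply` inside two nested loops:

```python
class Calculator:
    def multiply(self, x, y):
        return x * y

    def square(self, n):
        return self.multiply(n, n)

def squared_table(calc, n):
    # Nested loops: multiply runs n * n times, so this is O(n^2).
    # The cost comes from the loop structure, not from the call nesting itself.
    return [[calc.multiply(i, j) for j in range(n)] for i in range(n)]

print(squared_table(Calculator(), 2))  # [[0, 0], [0, 1]]
```

Doubling n here quadruples the number of `multiply` calls, which is exactly the n-squared behavior the misconception wrongly attributes to a single nested call.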
Understanding how method calls add up helps you explain how your code runs and scales. This skill shows you can think about program speed clearly and write efficient code.
"What if the multiply method called another method inside it? How would the time complexity change?"