Math-related operations in Python - Time & Space Complexity
When we use math operations in code, it's important to understand how the running time grows as the inputs get bigger.
We want to know: does the program take longer when the numbers are larger, or when we perform more math steps?
Analyze the time complexity of the following code snippet.
```python
def sum_of_squares(n):
    total = 0
    for i in range(1, n + 1):
        total += i * i
    return total
```
This code adds up the squares of numbers from 1 to n.
Identify the loops, recursion, or array traversals that repeat work.
- Primary operation: Multiplying each number by itself and adding it to total.
- How many times: This happens once for every number from 1 to n.
As n gets bigger, the number of math steps grows in a straight line with n.
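One way to see this straight-line growth is to count the loop's steps directly. The sketch below is not part of the original snippet; `sum_of_squares_with_count` is a hypothetical variant that returns both the sum and the number of multiply-and-add steps performed.

```python
# Hypothetical instrumented version of sum_of_squares: it counts one
# step per loop iteration, so the step count equals n exactly.
def sum_of_squares_with_count(n):
    total = 0
    steps = 0
    for i in range(1, n + 1):
        total += i * i  # one multiplication and one addition
        steps += 1
    return total, steps

for n in (10, 100, 1000):
    total, steps = sum_of_squares_with_count(n)
    print(f"n = {n}: {steps} multiply-and-add steps")
```

Running this prints step counts of 10, 100, and 1000, matching the table below.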
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 multiplications and additions |
| 100 | 100 multiplications and additions |
| 1000 | 1000 multiplications and additions |
Pattern observation: Doubling n doubles the work needed.
Time Complexity: O(n)
This means the time to finish grows directly with the size of n.
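As an aside that highlights why complexity matters, the same sum also has a well-known closed-form formula, n(n + 1)(2n + 1) / 6, which takes a fixed handful of operations no matter how large n is. This alternative is not part of the original snippet; it is a sketch contrasting O(n) with O(1).

```python
# The O(n) loop from the lesson.
def sum_of_squares(n):
    total = 0
    for i in range(1, n + 1):
        total += i * i
    return total

# Closed-form alternative: a constant number of operations, i.e. O(1).
def sum_of_squares_closed_form(n):
    return n * (n + 1) * (2 * n + 1) // 6
```

Both functions return the same value, but the loop performs n steps while the formula always performs the same few.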
[X] Wrong: "Math operations like multiplication take longer as numbers get bigger."
[OK] Correct: In most programming languages, basic math operations on fixed-size numbers take roughly constant time regardless of their values. (Python's integers are arbitrary-precision, so operations on extremely large numbers can slow down, but within normal limits they are treated as constant-time.)
Understanding how math operations affect time helps you explain your code clearly and shows you think about efficiency.
"What if we changed the code to calculate the sum of squares for every pair of numbers from 1 to n? How would the time complexity change?"
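One possible reading of "every pair" is sketched below; the function name and the exact operation per pair are assumptions for illustration, not part of the original lesson. The key point is the structure: a nested loop does one multiply-and-add per pair (i, j), so it runs n * n times and the time complexity grows to O(n^2), meaning that doubling n quadruples the work.

```python
# Hypothetical pairwise variant: for each pair (i, j) with 1 <= i, j <= n,
# add the square of the pair's product. The inner loop runs n times for
# each of the n outer iterations, giving n * n steps in total: O(n^2).
def sum_of_squares_of_pairs(n):
    total = 0
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            total += (i * j) ** 2  # one step per pair
    return total
```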