
Common broadcasting patterns in NumPy - Time & Space Complexity

Time Complexity: Common broadcasting patterns
O(n*m)
Understanding Time Complexity

When using NumPy, broadcasting lets us perform arithmetic on arrays of different shapes without writing explicit Python loops.

We want to know how the running time of these operations grows as the arrays get bigger.
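As a minimal sketch of what broadcasting does (the array names and sizes here are illustrative, not from the snippet below): a column of per-row offsets can be added to a whole matrix in one expression.

```python
import numpy as np

matrix = np.zeros((3, 4))            # shape (3, 4)
offsets = np.array([[1], [2], [3]])  # shape (3, 1): one offset per row

# offsets is broadcast across the 4 columns; no Python loop needed.
result = matrix + offsets
```

Every element of row i of `result` equals the offset for row i.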

Scenario Under Consideration

Analyze the time complexity of the following code snippet.

import numpy as np

A = np.ones((1000, 1000))
B = np.arange(1000).reshape(1000, 1)

C = A + B  # Broadcasting B to match A's shape

This code adds a 1000x1000 array to a 1000x1 array by broadcasting the smaller array.
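A scaled-down version of the snippet (4x4 instead of 1000x1000, purely for inspection) makes the broadcast result easy to check by hand:

```python
import numpy as np

A = np.ones((4, 4))
B = np.arange(4).reshape(4, 1)

# B's single column is stretched across all 4 columns of A.
C = A + B

# Every entry in row i of C equals 1 + i.
```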

Identify Repeating Operations

Identify the loops, recursion, or array traversals that repeat.

  • Primary operation: Element-wise addition over the full 1000x1000 array.
  • How many times: Once for each of the 1,000,000 elements in the result.
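The element count above can be confirmed directly: the broadcast result shape determines how many additions happen, one per output element.

```python
import numpy as np

A = np.ones((1000, 1000))
B = np.arange(1000).reshape(1000, 1)
C = A + B

# One addition per element of the broadcast result shape:
result_shape = np.broadcast_shapes(A.shape, B.shape)  # (1000, 1000)
n_additions = C.size                                  # 1,000,000
```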
How Execution Grows With Input

As the arrays get bigger, the number of additions grows with the total number of elements in the output.

Input Size (n)    Approx. Operations
10                100 (10x10)
100               10,000 (100x100)
1000              1,000,000 (1000x1000)

Pattern observation: Operations grow roughly with the product of the output array dimensions.
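The table above can be reproduced in a sketch by asking NumPy for the broadcast output shape at each size and taking the product of its dimensions:

```python
import numpy as np

# Output size for a square A (n x n) plus a column B (n x 1), per table row.
ops = {n: int(np.prod(np.broadcast_shapes((n, n), (n, 1))))
       for n in (10, 100, 1000)}
# ops == {10: 100, 100: 10000, 1000: 1000000}
```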

Final Time Complexity

Time Complexity: O(n*m)

This means the time grows proportionally to the total number of elements in the output array.

Common Mistake

[X] Wrong: "Broadcasting makes the operation run only as fast as the smaller array."

[OK] Correct: Even though broadcasting avoids copying data, the operation still processes every element in the output array, so time depends on the output size.
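The "no copy, but full output" point can be seen in a sketch using `np.broadcast_to`, which exposes the broadcast view directly: the stretched axis has stride 0 (all columns alias the same memory), yet the addition still writes every output element.

```python
import numpy as np

B = np.arange(1000).reshape(1000, 1)

# broadcast_to returns a read-only view: no data is copied.
B_view = np.broadcast_to(B, (1000, 1000))

# The broadcast axis has stride 0, so all 1000 columns alias B's one column.
col_stride = B_view.strides[1]  # 0 bytes

# The addition must still produce every output element, though:
C = np.ones((1000, 1000)) + B
total_writes = C.size  # 1,000,000
```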

Interview Connect

Understanding the time cost of broadcasting helps you explain how vectorized code scales and why it's efficient compared to explicit Python loops.

Self-Check

What if we changed the smaller array to have shape (1, m) instead of (n, 1)? How would the time complexity change?
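One way to explore this empirically (the sizes n=500 and m=800 here are illustrative assumptions) is to compare both broadcast directions and inspect the output shapes:

```python
import numpy as np

n, m = 500, 800  # illustrative sizes

A = np.ones((n, m))
row = np.arange(m).reshape(1, m)  # shape (1, m): broadcast down the rows
col = np.arange(n).reshape(n, 1)  # shape (n, 1): broadcast across the columns

# Either way, the output has n * m elements.
row_result = A + row
col_result = A + col
```

Since both results have the same shape, counting their elements answers the question about total work.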