Logical operations (and, or, not) in NumPy - Time & Space Complexity
We want to understand how the running time of logical operations on NumPy arrays scales as the arrays grow.
Specifically, how does the work grow when applying AND, OR, and NOT element-wise?
Analyze the time complexity of the following code snippet.
```python
import numpy as np

arr1 = np.random.choice([True, False], size=1000)
arr2 = np.random.choice([True, False], size=1000)

result_and = np.logical_and(arr1, arr2)
result_or = np.logical_or(arr1, arr2)
result_not = np.logical_not(arr1)
```
This code creates two arrays of 1000 random True/False values and applies logical AND, OR, and NOT element-wise.
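As a side note, for boolean arrays the bitwise operators `&`, `|`, and `~` produce the same element-wise results as the `logical_*` functions, and are a common shorthand. A quick sketch confirming the equivalence:

```python
import numpy as np

arr1 = np.random.choice([True, False], size=1000)
arr2 = np.random.choice([True, False], size=1000)

# On boolean arrays, the bitwise operators match the logical_* functions
# element by element.
assert np.array_equal(arr1 & arr2, np.logical_and(arr1, arr2))
assert np.array_equal(arr1 | arr2, np.logical_or(arr1, arr2))
assert np.array_equal(~arr1, np.logical_not(arr1))
```

Either spelling does the same amount of per-element work, so the complexity analysis below applies to both.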
Identify the loops, recursion, or array traversals that repeat.
- Primary operation: Element-wise logical operations on arrays.
- How many times: Once per element (n times, where n is the array length).
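To make the hidden loop visible, here is a conceptual pure-Python equivalent of `np.logical_and` (the helper name `logical_and_loop` is ours; NumPy's real implementation is a compiled loop, which is why it is much faster in practice, but it does the same n element checks):

```python
import numpy as np

def logical_and_loop(a, b):
    """Conceptual equivalent of np.logical_and: one check per element."""
    out = np.empty(len(a), dtype=bool)
    for i in range(len(a)):      # this loop body runs exactly n times
        out[i] = a[i] and b[i]
    return out

arr1 = np.random.choice([True, False], size=1000)
arr2 = np.random.choice([True, False], size=1000)
assert np.array_equal(logical_and_loop(arr1, arr2), np.logical_and(arr1, arr2))
```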
As the array size grows, the number of logical checks grows directly with it.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 logical checks |
| 100 | About 100 logical checks |
| 1000 | About 1000 logical checks |
Pattern observation: The work grows linearly with the input size.
Time Complexity: O(n)
This means the time to do logical operations grows directly with the number of elements.
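You can see the linear trend for yourself with a rough timing sketch (absolute numbers will vary by machine; only the growth pattern matters):

```python
import time
import numpy as np

for n in (10_000, 100_000, 1_000_000):
    a = np.random.choice([True, False], size=n)
    b = np.random.choice([True, False], size=n)
    start = time.perf_counter()
    result = np.logical_and(a, b)     # one check per element: O(n)
    elapsed = time.perf_counter() - start
    print(f"n={n:>9}: {elapsed:.6f}s")
```

Roughly, each tenfold increase in n should produce about a tenfold increase in time, though small arrays are dominated by fixed overhead.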
[X] Wrong: "Logical operations on arrays are instant and don't depend on size."
[OK] Correct: Each element must be checked, so bigger arrays take more time.
Understanding how logical operations scale helps you write efficient code when working with large data sets.
"What if we used nested arrays (2D arrays) instead of 1D? How would the time complexity change?"
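One way to explore that question: a 2D array with r rows and c columns has r × c elements, and element-wise logical operations still touch each element exactly once. So the complexity is O(r × c), which is still linear in the total number of elements:

```python
import numpy as np

# A 100 x 50 boolean grid has 5000 elements; logical_and checks each
# one once, so the work is O(rows * cols) -- linear in total elements.
grid1 = np.random.choice([True, False], size=(100, 50))
grid2 = np.random.choice([True, False], size=(100, 50))

result = np.logical_and(grid1, grid2)
assert result.shape == (100, 50)
assert result.size == 100 * 50   # 5000 element-wise checks
```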