Median and uniform filters in SciPy - Time & Space Complexity
When using median and uniform filters, it is important to understand how the processing time grows as the input array gets bigger.
Analyze the time complexity of the following code snippet.
import numpy as np
from scipy.ndimage import median_filter, uniform_filter

arr = np.random.rand(1000)                       # 1D array of 1000 random values
filtered_median = median_filter(arr, size=3)     # median of each 3-element window
filtered_uniform = uniform_filter(arr, size=3)   # mean of each 3-element window
This code applies median and uniform filters of size 3 to a 1D array of 1000 elements.
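To make the window operation concrete, here is a minimal pure-NumPy sketch of the same idea. The helper name `manual_window_filter` is an illustrative assumption, not a SciPy function; the edge handling uses NumPy's `symmetric` padding, which corresponds to `scipy.ndimage`'s default `reflect` boundary mode.

```python
import numpy as np

def manual_window_filter(arr, size, op):
    """Illustrative sliding-window filter: pad the edges by reflection,
    then apply `op` (e.g. np.median or np.mean) to each window."""
    half = size // 2
    padded = np.pad(arr, half, mode="symmetric")  # mimics scipy's 'reflect' mode
    return np.array([op(padded[i:i + size]) for i in range(len(arr))])

arr = np.array([4.0, 1.0, 3.0, 2.0, 5.0])
manual_median = manual_window_filter(arr, 3, np.median)  # [4., 3., 2., 3., 5.]
manual_mean = manual_window_filter(arr, 3, np.mean)      # [3., 8/3, 2., 10/3, 4.]
```

Note that both filters share the exact same loop structure; only the per-window operation differs.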
Identify the loops, recursion, and array traversals that repeat:
- Primary operation: For each element, the filter looks at a small window of neighbors and computes either the median or the average.
- How many times: This operation repeats once for every element in the input array.
As the input array size grows, the filter must process more elements, each requiring a small fixed number of operations.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | about 10 × k (k = window size) |
| 100 | about 100 × k |
| 1000 | about 1000 × k |
Pattern observation: The total work grows roughly in direct proportion to the input size.
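This pattern can be checked by counting element visits directly. The sketch below (the counting helper is hypothetical, not part of SciPy) runs a sliding-window median while tallying how many array elements it touches:

```python
import numpy as np

def median_filter_counting(arr, size):
    """Hypothetical helper: sliding-window median that tallies element visits."""
    half = size // 2
    padded = np.pad(arr, half, mode="symmetric")  # reflect the edges
    visits = 0
    out = np.empty(len(arr))
    for i in range(len(arr)):
        window = padded[i:i + size]
        visits += window.size       # k element visits per output element
        out[i] = np.median(window)
    return out, visits

for n in (10, 100, 1000):
    _, visits = median_filter_counting(np.random.rand(n), size=3)
    print(n, visits)  # visits grow as n * k: 30, 300, 3000
```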
Time Complexity: O(n * k) where k is the window size
This means the filtering time grows linearly with both the number of input elements and the window size; for a fixed window size, it is effectively O(n).
[X] Wrong: "Median filtering takes much longer than uniform filtering because it sorts the window every time."
[OK] Correct: Although median filtering does more work per element, both filters still process each element once, so their time grows linearly with input size. The difference is in constant factors, not the overall growth pattern.
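One way to see the constant-factor claim is a simple cost model (an assumed model, not SciPy's actual implementation): averaging a window of k values takes roughly k additions, while computing a median by sorting takes on the order of k · log2(k) comparisons. Both costs are fixed once k is fixed, so the total work stays proportional to n either way:

```python
import math

K = 3  # window size (fixed)
mean_cost_per_window = K                   # ~k additions per window
median_cost_per_window = K * math.log2(K)  # ~k log k comparisons (sort-based model)

for n in (10, 100, 1000):
    # Both totals scale linearly in n; only the constant multiplier differs.
    print(n, n * mean_cost_per_window, n * median_cost_per_window)
```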
Understanding how filtering operations scale helps you explain performance in data processing tasks clearly and confidently.
What if we increased the filter size from 3 to 7? How would the time complexity change?
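Under the O(n * k) model above, growing the window changes only the constant multiplier, not the growth rate with n. A quick sketch (the `window_visits` helper is purely illustrative):

```python
def window_visits(n, k):
    """Element visits under the simple window-by-window model: n * k."""
    return n * k

# Going from size=3 to size=7 multiplies the work by 7/3 (about 2.3x),
# but doubling n still doubles the work in both cases.
ratio = window_visits(1000, 7) / window_visits(1000, 3)
```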