# Image Filtering (`gaussian_filter`) in SciPy: Time & Space Complexity
We want to understand how the cost of applying a Gaussian filter grows as the image gets bigger. Analyze the time complexity of the following code snippet.
```python
import numpy as np
from scipy.ndimage import gaussian_filter

image = np.random.rand(512, 512)
filtered_image = gaussian_filter(image, sigma=1)
```
This code applies a Gaussian blur to a 2D image array using SciPy's `gaussian_filter` function. To analyze it, identify the loops, recursion, or array traversals that do the repeated work.
- Primary operation: Applying the Gaussian kernel to each pixel by taking a weighted combination of its neighbors.
- How many times: Once for every pixel in the image. Each application touches a small, fixed-size neighborhood whose extent depends on `sigma`, not on the image size.
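SciPy implements `gaussian_filter` as a sequence of 1D Gaussian convolutions, one per axis. As a rough illustration (not SciPy's actual code), a minimal separable blur in NumPy might look like the sketch below; the kernel radius of `truncate * sigma` mirrors SciPy's default:

```python
# Sketch of a separable Gaussian blur (illustrative, not SciPy's implementation).
# Each pixel is combined with a small, fixed-size neighborhood, so the
# per-pixel cost depends on sigma, not on the image size.
import numpy as np

def gaussian_kernel_1d(sigma, truncate=4.0):
    # Kernel radius matches SciPy's default: truncate * sigma, rounded.
    radius = int(truncate * sigma + 0.5)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()  # normalize so the weights sum to 1

def separable_blur(image, sigma=1.0):
    k = gaussian_kernel_1d(sigma)
    # Two 1D passes (rows, then columns) instead of one 2D convolution.
    tmp = np.apply_along_axis(np.convolve, 1, image, k, mode="same")
    return np.apply_along_axis(np.convolve, 0, tmp, k, mode="same")

image = np.random.rand(64, 64)
blurred = separable_blur(image, sigma=1.0)
print(blurred.shape)  # (64, 64): one output value per input pixel
```

The separable form is why the kernel cost stays small: two 1D passes of length `2 * radius + 1` replace a full 2D convolution.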
The time grows roughly in proportion to the number of pixels in the image.
| Input Size (n × n) | Approx. Operations |
|---|---|
| 10 × 10 | ~100 pixels × small fixed cost |
| 100 × 100 | ~10,000 pixels × small fixed cost |
| 1000 × 1000 | ~1,000,000 pixels × small fixed cost |
Pattern observation: Doubling the image width and height roughly quadruples the work because the total pixels increase by the square.
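The pattern above can be checked with back-of-envelope arithmetic. The constant `k` below (taps per 1D pass) is an assumption: `sigma=1` with SciPy's default `truncate=4.0` gives a radius of 4, hence 9 taps.

```python
# Rough operation count for a separable filter on an n x n image:
# two 1D passes of k taps each per pixel, about 2 * k * n**2 multiply-adds.
def approx_ops(n, k=9):  # k=9 taps assumes sigma=1, truncate=4.0
    return 2 * k * n * n

for n in (10, 100, 1000):
    print(n, approx_ops(n))

# The fixed constant k cancels in the ratio: doubling the side length
# roughly quadruples the work.
print(approx_ops(200) / approx_ops(100))  # 4.0
```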
Time Complexity: O(n^2) for an n × n image, where n is the side length. Equivalently, the time is linear in the total number of pixels. Space Complexity: O(n^2) as well, since the filter produces an output array the same shape as the input.
[X] Wrong: "The time depends on the size of the Gaussian kernel only, not the image size."
[OK] Correct: The kernel size is fixed and small, but the filter must be applied to every pixel, so the image size dominates the time.
Understanding how image size affects filtering time helps you reason about performance in real projects and shows you can analyze common data science operations.
"What if we applied gaussian_filter to a 3D image (like a color image)? How would the time complexity change?"