Spectrogram generation in SciPy - Time & Space Complexity
When we create a spectrogram, we split a signal into pieces and analyze each piece.
We want to know how the time to make a spectrogram grows as the signal gets longer.
Analyze the time complexity of the following code snippet.
```python
import numpy as np
from scipy.signal import spectrogram

fs = 1000                        # Sampling frequency in Hz
signal = np.random.randn(10000)  # Example signal: 10,000 samples of noise
frequencies, times, Sxx = spectrogram(signal, fs=fs, nperseg=256)
```
This code computes the spectrogram of a signal by splitting it into overlapping segments and applying Fourier transforms.
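As a quick sanity check on that description, we can inspect the shapes of the returned arrays. The counts below assume SciPy's documented defaults (`return_onesided=True` and `noverlap = nperseg // 8`):

```python
import numpy as np
from scipy.signal import spectrogram

fs = 1000
signal = np.random.randn(10000)
frequencies, times, Sxx = spectrogram(signal, fs=fs, nperseg=256)

# One-sided output: nperseg // 2 + 1 = 129 frequency bins.
print(frequencies.shape)
# Default overlap is nperseg // 8 = 32 samples, so each segment advances
# by 256 - 32 = 224 samples: (10000 - 32) // 224 = 44 segments.
print(times.shape)
# Sxx holds one power spectrum per segment: (129, 44).
print(Sxx.shape)
```

Each column of `Sxx` is the result of one FFT, so the number of columns is exactly the segment count we reason about below.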
First, identify the repeated work: any loops, recursion, or array traversals inside the call.
- Primary operation: Computing the Fourier transform on each segment of the signal.
- How many times: The number of segments, which depends on the signal length and segment size.
As the signal length grows, the number of segments grows roughly in proportion.
With SciPy's default overlap of nperseg // 8 = 32 samples, each segment advances by 256 - 32 = 224 samples, so the segment count is roughly n / 224:

| Input Size (n) | Approx. Operations |
|---|---|
| 10,000 | About 44 Fourier transforms |
| 100,000 | About 446 Fourier transforms |
| 1,000,000 | About 4,464 Fourier transforms |
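The segment counts can be reproduced from SciPy's segmenting rule (a sketch assuming the documented default `noverlap = nperseg // 8` and the stride formula `(n - noverlap) // (nperseg - noverlap)`):

```python
def num_segments(n, nperseg=256, noverlap=None):
    # scipy.signal.spectrogram defaults noverlap to nperseg // 8.
    if noverlap is None:
        noverlap = nperseg // 8
    step = nperseg - noverlap  # hop between segment starts
    return (n - noverlap) // step

for n in (10_000, 100_000, 1_000_000):
    print(n, num_segments(n))  # 44, 446, 4464: ~linear growth in n
```

A 10x longer signal yields roughly 10x as many segments, which is the linear pattern the table shows.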
Pattern observation: The total work grows roughly linearly with the signal length.
Time Complexity: O(n log m)
Here n is the signal length and m is the segment size (nperseg). There are on the order of n/m segments, and the FFT of a length-m segment costs O(m log m), so the total is O((n/m) * m log m) = O(n log m): linear in the signal length, with a logarithmic factor in the segment size.
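We can check the linear-in-n part of this claim with a back-of-the-envelope operation count (a sketch; the hop of 224 and the m log2(m) FFT cost model are assumptions, not measured constants):

```python
import math

def est_ops(n, m=256, hop=224):
    # ~n / hop segments, each costing ~m * log2(m) operations for its FFT.
    segments = n // hop
    return segments * m * math.log2(m)

ratio = est_ops(100_000) / est_ops(10_000)
print(ratio)  # close to 10: total work scales linearly with n
```

A 10x longer signal costs about 10x the work, because m (and hence the log m factor) is held fixed.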
[X] Wrong: "The spectrogram time grows only with the signal length, ignoring the cost of Fourier transforms."
[OK] Correct: Each segment requires a Fourier transform whose cost depends on the segment size, so both the signal length and the segment size matter.
Understanding how spectrogram computation scales helps you explain performance in signal processing tasks clearly and confidently.
"What if we change the segment size to be larger? How would the time complexity change?"