QR decomposition in SciPy - Time & Space Complexity
We want to understand how the time needed to perform QR decomposition changes as the size of the matrix grows.
Specifically, how does the work increase when we have bigger matrices?
Analyze the time complexity of the following code snippet.
```python
import numpy as np
from scipy.linalg import qr

A = np.random.rand(100, 100)
Q, R = qr(A)
```
This code creates a 100x100 matrix and computes its QR decomposition using SciPy.
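A quick sanity check (an illustrative sketch) confirms what the two factors look like:

```python
import numpy as np
from scipy.linalg import qr

A = np.random.rand(100, 100)
Q, R = qr(A)

# Q is orthogonal and R is upper triangular; their product recovers A
# up to floating-point round-off.
print(np.allclose(Q.T @ Q, np.eye(100)))   # True
print(np.allclose(np.triu(R), R))          # True
print(np.allclose(Q @ R, A))               # True
```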
Identify the loops, recursion, and array traversals that repeat.
- Primary operation: Multiplying and subtracting vectors and matrices during the decomposition process.
- How many times: These operations repeat for every column of the matrix, and each step updates the remaining rows and columns, so for an n x n matrix the total work is roughly proportional to n^3.
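To make that column-by-column structure concrete, here is a minimal Householder QR sketch. This is not SciPy's implementation (which calls optimized LAPACK routines), but it has the same operation count: one reflection per column, each updating an O(m x n) trailing block.

```python
import numpy as np

def householder_qr(A):
    # Illustrative Householder QR; SciPy's qr uses optimized LAPACK,
    # but the asymptotic operation count is the same.
    R = np.array(A, dtype=float)
    m, n = R.shape
    Q = np.eye(m)
    for k in range(min(m, n)):              # one reflection per column
        x = R[k:, k].copy()
        x[0] += np.copysign(np.linalg.norm(x), x[0])
        norm_x = np.linalg.norm(x)
        if norm_x == 0:
            continue                        # column already zero below the diagonal
        v = x / norm_x
        # Each reflection updates a trailing block of roughly m*n entries,
        # so n columns give work on the order of m*n^2 (n^3 when m == n).
        R[k:, :] -= 2.0 * np.outer(v, v @ R[k:, :])
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, R
```

Running it on a small random matrix and checking `Q @ R` against the input is an easy way to convince yourself the loop structure above really produces the decomposition.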
As the matrix size grows, the number of calculations grows much faster.
| Input Size (n x n) | Approx. Operations |
|---|---|
| 10 x 10 | About 1,000 operations |
| 100 x 100 | About 1,000,000 operations |
| 1000 x 1000 | About 1,000,000,000 operations |
Pattern observation: When the matrix size doubles, the work increases roughly by eight times.
Time Complexity: O(n^3)
This means the time needed grows roughly with the cube of the matrix size, so bigger matrices take much longer to process.
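You can observe this trend empirically with a small timing loop (a sketch; absolute times depend on your machine and BLAS library, so only the ratios between sizes matter):

```python
import time
import numpy as np
from scipy.linalg import qr

# Doubling n should multiply the time by roughly 8 once n is large
# enough for the cubic term to dominate.
for n in (200, 400, 800):
    A = np.random.rand(n, n)
    start = time.perf_counter()
    qr(A)
    elapsed = time.perf_counter() - start
    print(f"n = {n:4d}: {elapsed:.4f} s")
```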
[X] Wrong: "QR decomposition takes time proportional to the number of elements in the matrix (n squared)."
[OK] Correct: Eliminating each of the n columns requires updating an O(n x n) trailing block of the matrix, so the total work is roughly n * n^2 = n^3, which grows much faster than the n^2 elements of the matrix.
Understanding how QR decomposition scales helps you explain the cost of matrix operations clearly, a useful skill in data science and engineering roles.
"What if we only decomposed a rectangular matrix with many more rows than columns? How would the time complexity change?"
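One way to explore that question empirically (a sketch; the sizes are arbitrary): for an m x n matrix with m much larger than n, Householder QR costs on the order of m * n^2 operations, so the work grows only linearly in the row count once the column count is fixed. SciPy's `mode='economic'` option also avoids building the full m x m Q.

```python
import numpy as np
from scipy.linalg import qr

# Tall-and-skinny matrix: far more rows than columns.
A = np.random.rand(10_000, 50)

# mode='economic' returns the reduced factors: Q is (10000, 50) and
# R is (50, 50), instead of a full 10000 x 10000 Q. The operation
# count is on the order of m * n^2, i.e. linear in the row count here.
Q, R = qr(A, mode='economic')
print(Q.shape, R.shape)   # (10000, 50) (50, 50)
```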