Why Linear Algebra Is MATLAB's Core: A Performance Analysis
We want to understand how the running time of MATLAB's core tasks grows as the size of the data increases. Specifically, we examine the linear algebra operations that MATLAB relies on heavily.
Analyze the time complexity of the following matrix multiplication code:
A = rand(n, n);  % n-by-n matrix of uniform random entries
B = rand(n, n);  % second n-by-n random matrix
C = A * B;       % matrix product (not element-wise .*)
This code multiplies two square matrices of size n by n.
Look at what repeats inside the matrix multiplication.
- Primary operation: a multiply-add for each element contributing to an entry of the result matrix.
- How many times: the result C has n × n entries, and each entry is a dot product of a row of A with a column of B, requiring n multiplications and n - 1 additions.
So the total work is roughly n × n × n = n^3 operations, which grows far faster than n itself.
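The triple loop hidden inside `C = A * B` can be sketched in plain Python. The `matmul` helper below is written for illustration only; MATLAB's built-in `*` calls optimized BLAS routines, but the multiply-add count is the same:

```python
def matmul(A, B):
    """Naive square matrix multiply; also counts multiply-add steps."""
    n = len(A)
    ops = 0
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):            # one pass per row of C
        for j in range(n):        # one pass per column of C
            s = 0.0
            for k in range(n):    # n multiply-adds per entry of C
                s += A[i][k] * B[k][j]
                ops += 1
            C[i][j] = s
    return C, ops

C, ops = matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])
print(C)    # [[19.0, 22.0], [43.0, 50.0]]
print(ops)  # 8 = 2^3 multiply-adds for n = 2
```

Three nested loops of length n each: that is where the n^3 comes from.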
| Input Size (n) | Approx. Operations (n^3) |
|---|---|
| 10 | About 1,000 |
| 100 | About 1,000,000 |
| 1000 | About 1,000,000,000 |
Pattern observation: operations grow as the cube of n, so doubling n multiplies the work by 8, and increasing n tenfold multiplies it by 1,000.
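The cubic pattern in the table can be checked with simple arithmetic; `ops` below is just the n^3 count, not a MATLAB function:

```python
def ops(n):
    """Multiply-add count for naive n-by-n matrix multiplication."""
    return n ** 3

# Reproduce the table above
for n in (10, 100, 1000):
    print(f"{n:>5}  {ops(n):>13,}")

# Growth factors: doubling n gives 8x the work, tenfold n gives 1000x
print(ops(200) // ops(100))    # 8
print(ops(1000) // ops(100))   # 1000
```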
Time Complexity: O(n^3)
This means the running time grows very quickly with matrix size, because every entry of the result depends on an entire row of A and an entire column of B.
[X] Wrong: "Matrix multiplication takes time proportional to n, just like a simple loop."
[OK] Correct: standard matrix multiplication uses three nested loops — n multiply-adds for each of the n^2 output entries — so the work grows as n^3, far faster than n.
Knowing how matrix operations scale lets you reason about performance across many MATLAB tasks and demonstrates an understanding of core computing ideas.
"What if we multiply a matrix of size n by a matrix of size n by m? How would the time complexity change?"
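One way to explore this closing question is to generalize the operation count. The sketch below is illustrative (not MATLAB's internal algorithm): an n-by-n times n-by-m product yields an n-by-m result, and each of its n·m entries needs n multiply-adds:

```python
def rect_ops(n, m):
    # Result is n-by-m; each entry needs n multiply-adds,
    # so the total count is n * n * m, i.e. O(n^2 * m).
    return n * n * m

print(rect_ops(100, 100))  # 1,000,000 — matches the square n = 100 case
print(rect_ops(100, 10))   # 100,000 — a tenth of the work
```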