Matrix multiplication (%*%) in R Programming - Time & Space Complexity
Matrix multiplication is a common operation in many programs. Understanding how its running time grows with input size helps us write faster code.
We want to know how the amount of work changes as the matrices get bigger.
Analyze the time complexity of the following code snippet.
```r
# Multiply two square matrices A and B of size n x n
n <- 3
A <- matrix(1:(n*n), n, n)
B <- matrix((n*n):1, n, n)
C <- A %*% B
```
This code multiplies two square matrices using R's built-in matrix multiplication operator.
Identify the operations that repeat: loops, recursion, and array traversals.
- Primary operation: multiplying and summing elements to compute each cell of the result matrix.
- How many times: the result has n rows and n columns, and each of its n × n cells requires n multiply-add steps, giving n × n × n operations in total.
As the matrix size n grows, the operation count grows quickly, because each of the n² result cells needs n calculations.
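The counting argument above can be made concrete with explicit loops. This is a minimal sketch of the textbook triple-loop algorithm (the helper name `naive_matmul` and the `ops` counter are illustrative, not part of base R); the built-in `%*%` delegates to optimized BLAS routines, but the asymptotic operation count is the same.

```r
# Naive matrix multiplication with explicit loops, counting operations.
naive_matmul <- function(A, B) {
  n <- nrow(A)
  m <- ncol(A)   # must equal nrow(B)
  p <- ncol(B)
  C <- matrix(0, n, p)
  ops <- 0
  for (i in 1:n) {          # n iterations
    for (j in 1:p) {        # x p iterations
      for (k in 1:m) {      # x m iterations -> n * p * m multiply-adds
        C[i, j] <- C[i, j] + A[i, k] * B[k, j]
        ops <- ops + 1
      }
    }
  }
  list(result = C, ops = ops)
}

n <- 3
A <- matrix(1:(n*n), n, n)
B <- matrix((n*n):1, n, n)
out <- naive_matmul(A, B)
all.equal(out$result, A %*% B)  # TRUE: matches the built-in operator
out$ops                         # 27, i.e. n^3
```

For square matrices n = m = p, the operation counter comes out to exactly n³, which is where the O(n³) bound comes from.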
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 1,000 operations |
| 100 | About 1,000,000 operations |
| 1000 | About 1,000,000,000 operations |
Pattern observation: The work grows very fast, roughly by the cube of n.
Time Complexity: O(n^3)
This means that if the matrix size doubles, the work needed grows by a factor of about eight (2^3 = 8).
Space Complexity: O(n^2), since the two inputs and the result each store n × n elements.
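One way to see the "doubling means roughly eight times the work" claim is to time `%*%` at two sizes. This is only a rough empirical sketch: the measured ratio varies with the BLAS library R is linked against, caching, and machine load, and a multi-threaded BLAS can make it much smaller than 8.

```r
# Rough empirical check of cubic scaling: time n x n multiplication
# at size n and at size 2n, then compare elapsed times.
time_mult <- function(n) {
  A <- matrix(runif(n * n), n, n)
  B <- matrix(runif(n * n), n, n)
  system.time(A %*% B)[["elapsed"]]
}

t1 <- time_mult(500)
t2 <- time_mult(1000)
t2 / t1   # often near 8 with a plain single-threaded BLAS, but varies
```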
[X] Wrong: "Matrix multiplication takes only n squared steps because the result is an n by n matrix."
[OK] Correct: Each of the n² cells in the result needs n multiplications and additions, so the total is n² × n = n³ steps.
Knowing how matrix multiplication scales helps you understand performance in many areas like graphics, data science, and simulations. It shows you how to think about work growing with input size.
"What if we multiply a matrix of size n x m by a matrix of size m x p? How would the time complexity change?"
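As a starting point for exploring that question, here is a hedged sketch of the rectangular case. Each of the n × p result cells needs m multiply-adds, which suggests O(n · m · p) work; the dimensions below (n = 2, m = 4, p = 3) are arbitrary example values.

```r
# Multiplying an n x m matrix by an m x p matrix gives an n x p result;
# each of the n * p cells takes m multiply-adds, so work is O(n * m * p).
n <- 2; m <- 4; p <- 3
A <- matrix(runif(n * m), n, m)
B <- matrix(runif(m * p), m, p)
C <- A %*% B
dim(C)        # 2 3 -- the result is n x p
n * p * m     # 24 scalar multiplications in total
```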