Matrix dimensions in R Programming - Time & Space Complexity
When working with matrices, it is important to understand how the size affects the time it takes to perform operations.
We want to know how the time grows as the matrix gets bigger.
Analyze the time complexity of the following code snippet.
```r
matrix_multiply <- function(A, B) {
  n <- nrow(A)             # rows of A
  m <- ncol(A)             # shared dimension: columns of A must equal rows of B
  p <- ncol(B)             # columns of B
  stopifnot(m == nrow(B))  # guard against incompatible dimensions
  result <- matrix(0, n, p)
  for (i in 1:n) {
    for (j in 1:p) {
      # Each entry is the dot product of row i of A and column j of B
      result[i, j] <- sum(A[i, ] * B[, j])
    }
  }
  return(result)
}
```
This code multiplies two matrices A and B by calculating each element of the result matrix.
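To make this concrete, here is a small self-contained check (the function definition is repeated so the snippet runs on its own) that compares `matrix_multiply` against R's built-in `%*%` operator:

```r
matrix_multiply <- function(A, B) {
  n <- nrow(A)
  p <- ncol(B)
  result <- matrix(0, n, p)
  for (i in 1:n) {
    for (j in 1:p) {
      result[i, j] <- sum(A[i, ] * B[, j])
    }
  }
  result
}

A <- matrix(1:6, nrow = 2)  # 2 x 3 matrix
B <- matrix(1:6, nrow = 3)  # 3 x 2 matrix
C <- matrix_multiply(A, B)
print(C)
all.equal(C, A %*% B)  # TRUE: matches the built-in operator
```

Checking against `%*%` is a quick way to confirm the loop-based version is correct before reasoning about its cost.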
Identify the repeated work: loops, recursion, and array traversals.
- Primary operation: Nested loops over rows of A and columns of B, plus a sum over a row and column.
- How many times: The outer loops run n * p times; inside each iteration, `sum` runs over the shared dimension m (the number of columns of A, which must equal the number of rows of B).
As the size of the matrices grows, the number of operations grows quickly because of the nested loops and sums.
| Input Size (n = p = shared dimension) | Approx. Operations |
|---|---|
| 10 | About 10 * 10 * 10 = 1,000 |
| 100 | About 100 * 100 * 100 = 1,000,000 |
| 1000 | About 1000 * 1000 * 1000 = 1,000,000,000 |
Pattern observation: The number of operations grows cubically with the input size.
Time Complexity: O(n^3)
Space Complexity: O(n * p) extra space for the result matrix.
This means that if the matrix size doubles, the time to multiply grows by about a factor of eight.
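One rough way to see the cubic growth empirically is to time the function at size n and 2n. The sketch below repeats the function definition so it is self-contained; absolute timings vary by machine, so only the ratio between the two runs is meaningful:

```r
matrix_multiply <- function(A, B) {
  n <- nrow(A)
  p <- ncol(B)
  result <- matrix(0, n, p)
  for (i in 1:n) {
    for (j in 1:p) {
      result[i, j] <- sum(A[i, ] * B[, j])
    }
  }
  result
}

n <- 150
A <- matrix(runif(n * n), n, n)
B <- matrix(runif(n * n), n, n)
t1 <- system.time(matrix_multiply(A, B))["elapsed"]

n2 <- 2 * n
A2 <- matrix(runif(n2 * n2), n2, n2)
B2 <- matrix(runif(n2 * n2), n2, n2)
t2 <- system.time(matrix_multiply(A2, B2))["elapsed"]

ratio <- t2 / t1  # often in the neighborhood of 8, per the O(n^3) prediction
```

Timings this small are noisy, so expect the ratio to be near 8 rather than exactly 8.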
[X] Wrong: "Matrix multiplication time grows only as the square of the size because it looks like two loops."
[OK] Correct: There is a third loop hidden in the sum operation, so the time actually grows with the cube of the size.
Understanding how matrix size affects time helps you explain performance in real tasks like graphics or data analysis.
"What if we used a more efficient algorithm that reduces the number of multiplications? How would the time complexity change?"
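As a pointer toward that question: divide-and-conquer schemes such as Strassen's algorithm reduce the exponent to roughly O(n^2.81), and R's built-in `%*%` delegates to optimized BLAS routines, which keeps the classical O(n^3) bound but is dramatically faster in practice than an explicit R loop:

```r
n <- 300
A <- matrix(runif(n * n), n, n)
B <- matrix(runif(n * n), n, n)

# Built-in multiplication: same asymptotic class as the naive triple loop,
# but heavily optimized (cache-friendly blocking, vectorized BLAS kernels).
system.time(C <- A %*% B)
dim(C)  # 300 300
```

A useful takeaway: a better constant factor (BLAS) and a better exponent (Strassen-style algorithms) are two different ways of improving on the naive loop.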