Challenge - 5 Problems
Broadcasting Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
Difficulty: intermediate · Time limit: 2:00
What is the output shape after broadcasting?
Given two tensors a and b in PyTorch, what is the shape of the result after adding them?

PyTorch
import torch

a = torch.randn(3, 1, 5)
b = torch.randn(1, 4, 1)
result = a + b
output_shape = result.shape
print(output_shape)
Attempts: 2 left
💡 Hint
Remember PyTorch broadcasts dimensions by expanding size 1 dimensions to match the other tensor.
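The hint's rule can be checked mechanically. Below is a small pure-Python sketch of the right-aligned broadcasting rule (a hypothetical helper, not part of PyTorch): pad the shorter shape with 1s on the left, then require each dimension pair to match or contain a 1.

```python
def broadcast_shape(shape_a, shape_b):
    # Hypothetical helper mirroring PyTorch's broadcasting rule.
    result = []
    # Walk both shapes from the rightmost (trailing) dimension.
    for i in range(1, max(len(shape_a), len(shape_b)) + 1):
        da = shape_a[-i] if i <= len(shape_a) else 1
        db = shape_b[-i] if i <= len(shape_b) else 1
        if da != db and 1 not in (da, db):
            raise ValueError(f"incompatible dims {da} and {db}")
        result.append(max(da, db))
    return tuple(reversed(result))

print(broadcast_shape((3, 1, 5), (1, 4, 1)))  # (3, 4, 5)
```

Running it on the shapes from this question reproduces the broadcast result without touching PyTorch.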
✗ Incorrect
Tensor a has shape (3, 1, 5) and b has shape (1, 4, 1). Broadcasting expands each size-1 dimension to match the other tensor's size, resulting in shape (3, 4, 5).

❓ Model Choice
Difficulty: intermediate · Time limit: 2:00
Which operation uses broadcasting correctly?
Which PyTorch operation below correctly uses broadcasting to multiply a tensor of shape (2,3) with a tensor of shape (3,)?
Attempts: 2 left
💡 Hint
Broadcasting works when trailing dimensions match or are 1.
✗ Incorrect
Option B multiplies (2,3) by (3,), which broadcasts the (3,) to (2,3). The other options have shapes that are incompatible for broadcasting.
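A minimal sketch of the correct case (assuming PyTorch is installed; variable names are illustrative):

```python
import torch

# Shapes (2, 3) and (3,): the trailing dimension (3) matches,
# so (3,) is treated as (1, 3) and expanded to (2, 3).
m = torch.randn(2, 3)
v = torch.randn(3)
out = m * v

print(out.shape)  # torch.Size([2, 3])
```

Each row of m is multiplied elementwise by v, which is a common way to apply per-feature scaling.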
❓ Hyperparameter
Difficulty: advanced · Time limit: 2:00
How does broadcasting affect batch size in model input?
You have a model expecting input shape (batch_size, features). You provide an input tensor of shape (features,) without a batch dimension. What happens if you rely on broadcasting to add the batch dimension?
Attempts: 2 left
💡 Hint
Broadcasting can expand size 1 dimensions but cannot add missing dimensions.
✗ Incorrect
Broadcasting can only expand existing dimensions of size 1. It cannot add a new batch dimension if missing. You must reshape or unsqueeze manually.
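A short sketch of the manual fix (assuming PyTorch is installed; the feature count is an illustrative placeholder):

```python
import torch

features = 5
x = torch.randn(features)      # shape (5,): no batch dimension

# Broadcasting cannot invent the missing batch axis, so add it
# explicitly with unsqueeze (reshape(1, -1) works equally well).
x_batched = x.unsqueeze(0)     # shape (1, 5)

print(x_batched.shape)  # torch.Size([1, 5])
```

After unsqueezing, the tensor matches the (batch_size, features) layout with a batch size of 1.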
🔧 Debug
Difficulty: advanced · Time limit: 2:00
Why does this broadcasting operation raise an error?
Consider the code below. Why does it raise a runtime error?
PyTorch
import torch

x = torch.randn(4, 3)
y = torch.randn(2, 3)
z = x + y
Attempts: 2 left
💡 Hint
Check the rules for broadcasting: dimensions are compared from right to left, starting with the trailing dimensions.
✗ Incorrect
Broadcasting requires each pair of dimensions to be equal or for one of them to be 1. Aligned from the right, the trailing dimensions (3 and 3) match, but the leading dimensions 4 and 2 differ and neither is 1, so the addition fails.
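A runnable sketch of both the failure and one possible fix (assuming PyTorch is installed; the size-1 replacement is illustrative, not the only repair):

```python
import torch

x = torch.randn(4, 3)
y = torch.randn(2, 3)

# Aligning from the right: 3 vs 3 matches, but 4 vs 2 does not,
# and neither is 1, so the addition raises a RuntimeError.
try:
    z = x + y
except RuntimeError as e:
    print("broadcast failed:", e)

# A shape that does broadcast: the size-1 leading dim expands to 4.
y_ok = torch.randn(1, 3)
z = x + y_ok
print(z.shape)  # torch.Size([4, 3])
```

Which repair is right depends on intent: if y really holds two distinct rows, the fix is in the surrounding logic, not in reshaping.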
🧠 Conceptual
Difficulty: expert · Time limit: 2:00
What is the effect of broadcasting on memory usage?
When PyTorch broadcasts a tensor during an operation, what happens to the underlying memory usage?
Attempts: 2 left
💡 Hint
Think about how broadcasting avoids unnecessary data duplication.
✗ Incorrect
PyTorch uses strides and views to simulate expanded shapes without copying data, so memory usage remains efficient.
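This can be observed directly with torch.Tensor.expand, which returns a broadcast-style view; a minimal sketch assuming PyTorch is installed:

```python
import torch

a = torch.randn(3, 1, 5)
expanded = a.expand(3, 4, 5)   # broadcast view, no data copied

# The view reuses the same storage: same data pointer, and the
# broadcast dimension gets stride 0 instead of duplicated data.
print(a.data_ptr() == expanded.data_ptr())  # True
print(expanded.stride())                    # (5, 0, 1)
```

The stride of 0 along the expanded axis means every index in that dimension reads the same underlying element, which is why broadcasting adds essentially no memory cost.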