Recall & Review
beginner
What is broadcasting in PyTorch?
Broadcasting is the mechanism by which PyTorch automatically expands tensors of different shapes to a common shape during elementwise operations, without actually copying the underlying data.
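A minimal sketch of this in action: adding a (3,)-shaped vector to a (2, 3)-shaped matrix, where PyTorch virtually expands the vector across the first dimension.

```python
import torch

# A (2, 3) matrix plus a (3,) vector: the vector is virtually
# expanded to shape (2, 3) without allocating a copy.
m = torch.tensor([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])
v = torch.tensor([10.0, 20.0, 30.0])
out = m + v
print(out.shape)   # torch.Size([2, 3])
print(out[0])      # tensor([11., 22., 33.])
```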
intermediate
How does PyTorch decide if two tensors can be broadcast together?
PyTorch compares tensor shapes from the rightmost dimension. Each pair of dimensions must be equal, or one of them must be 1; if one tensor has fewer dimensions, the missing leading dimensions are treated as size 1. If this holds for every pair, the tensors are broadcastable.
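A small sketch of that rule: shapes (4, 1, 6) and (5, 6) are compared from the right, and the missing leading dimension of the second tensor counts as size 1.

```python
import torch

a = torch.ones(4, 1, 6)   # shape (4, 1, 6)
b = torch.ones(5, 6)      # shape    (5, 6)
# Compare from the right: 6 vs 6 (equal), 1 vs 5 (one is 1),
# 4 vs a missing dim (treated as 1) -> compatible.
c = a + b
print(c.shape)   # torch.Size([4, 5, 6])
```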
beginner
Example: What is the result shape of adding a tensor of shape (3, 1) to a tensor of shape (1, 4)?
The result shape is (3, 4). The (3, 1) tensor is broadcast along the second dimension, and the (1, 4) tensor is broadcast along the first dimension.
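The example above can be sketched directly; this is also a handy trick for building an "outer sum" table:

```python
import torch

col = torch.arange(3).reshape(3, 1)   # shape (3, 1): values 0, 1, 2
row = torch.arange(4).reshape(1, 4)   # shape (1, 4): values 0, 1, 2, 3
table = col + row                     # both expand to (3, 4)
print(table.shape)   # torch.Size([3, 4])
print(table)
# tensor([[0, 1, 2, 3],
#         [1, 2, 3, 4],
#         [2, 3, 4, 5]])
```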
beginner
Why is broadcasting useful in machine learning?
Broadcasting lets you write simple code that works on tensors of different shapes, saving memory and making operations faster without manual reshaping.
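A typical ML example of this, sketched with made-up sizes: adding a per-feature bias to a whole batch of activations in one expression instead of looping over samples.

```python
import torch

batch = torch.randn(32, 128)   # 32 samples, 128 features
bias = torch.randn(128)        # one bias value per feature
out = batch + bias             # bias broadcasts across the batch dimension
print(out.shape)               # torch.Size([32, 128])

# Equivalent loop version, more verbose and slower:
# for i in range(32):
#     out[i] = batch[i] + bias
```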
beginner
What happens if two tensors have incompatible shapes for broadcasting?
PyTorch raises a RuntimeError because it cannot automatically expand the tensors to a common shape for the operation.
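A minimal sketch of the failure case: the trailing dimensions are 2 and 4, neither of which is 1, so the addition raises.

```python
import torch

a = torch.ones(3, 2)
b = torch.ones(4)   # trailing dims: 2 vs 4, neither is 1
try:
    a + b
except RuntimeError as e:
    print("broadcast failed:", e)
```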
Which condition must be true for two tensor dimensions to be broadcast together?
Broadcasting requires dimensions to be equal or one dimension to be 1.
What is the shape of the result when adding tensors of shapes (5, 1, 3) and (1, 4, 3)?
Broadcasting expands both tensors to (5, 4, 3) before addition.
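This can be verified directly: the first tensor expands along dimension 1, the second along dimension 0.

```python
import torch

x = torch.ones(5, 1, 3)
y = torch.ones(1, 4, 3)
z = x + y   # x expands along dim 1, y along dim 0
print(z.shape)   # torch.Size([5, 4, 3])
```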
What error occurs if tensors cannot be broadcast together?
PyTorch raises a RuntimeError when broadcasting fails.
Broadcasting helps to avoid which of the following?
Broadcasting avoids manual reshaping, loops, and unnecessary data copying.
In PyTorch, broadcasting compares tensor shapes starting from which side?
Broadcasting compares shapes starting from the rightmost dimension.
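The right-to-left rule can be made concrete with a small helper. Note that `broadcast_shape` below is a hypothetical illustration written in plain Python, not a PyTorch API (PyTorch applies the same logic internally).

```python
def broadcast_shape(shape_a, shape_b):
    """Hypothetical helper applying the right-to-left broadcasting rule."""
    # Pad the shorter shape with leading 1s, then compare right-to-left.
    n = max(len(shape_a), len(shape_b))
    a = (1,) * (n - len(shape_a)) + tuple(shape_a)
    b = (1,) * (n - len(shape_b)) + tuple(shape_b)
    out = []
    for da, db in zip(a, b):
        if da == db or da == 1 or db == 1:
            out.append(max(da, db))
        else:
            raise ValueError(f"incompatible dimensions {da} and {db}")
    return tuple(out)

print(broadcast_shape((5, 1, 3), (4, 3)))   # (5, 4, 3)
print(broadcast_shape((3, 1), (1, 4)))      # (3, 4)
```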
Explain broadcasting in PyTorch and why it is useful in simple terms.
Think about how PyTorch handles adding tensors of different shapes without extra work.
Describe the rules PyTorch uses to decide if two tensors can be broadcast together.
Focus on how PyTorch checks each dimension pair.