Recall & Review
beginner
Q: What does the .grad attribute in PyTorch represent?
A: The .grad attribute holds the gradient of a tensor after backpropagation: the derivative of the scalar output (typically the loss) with respect to that tensor.

beginner
Q: When is the .grad attribute populated in PyTorch?
A: The .grad attribute is filled in after calling backward() on the output of a computation graph in which the tensor has requires_grad=True.

beginner
Q: How do you enable gradient tracking for a tensor in PyTorch?
A: Set requires_grad=True when creating the tensor, for example: torch.tensor([1.0, 2.0], requires_grad=True).

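A minimal end-to-end sketch of gradient tracking (the values here are illustrative):

```python
import torch

# Create a leaf tensor that tracks gradients
x = torch.tensor([1.0, 2.0], requires_grad=True)

# Build a small computation graph: y = sum(x ** 2)
y = (x ** 2).sum()

# Backpropagate; dy/dx = 2 * x
y.backward()

print(x.grad)  # tensor([2., 4.])
```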
beginner
Q: What happens if you try to access .grad before calling backward()?
A: The .grad attribute will be None because gradients have not been computed yet.

intermediate
Q: How can you clear gradients stored in .grad before the next backward pass?
A: Use optimizer.zero_grad(), or call tensor.grad.zero_() to reset a tensor's gradient in place.

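A short sketch of why clearing matters (tensor and values are illustrative): repeated backward() calls accumulate into .grad, and zero_() resets it in place.

```python
import torch

x = torch.tensor([3.0], requires_grad=True)

(x * 2).sum().backward()
print(x.grad)  # tensor([2.])

# Without zeroing, the next backward ACCUMULATES into .grad
(x * 2).sum().backward()
print(x.grad)  # tensor([4.])

# Reset in place, so the next backward starts fresh
x.grad.zero_()
(x * 2).sum().backward()
print(x.grad)  # tensor([2.])
```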
Q: What must be set to True for a tensor to track gradients in PyTorch?
A: The requires_grad flag enables gradient tracking for tensors.

Q: When is the .grad attribute of a tensor updated?
A: Gradients are computed and stored in .grad after backward() is called.

Q: What is the value of .grad before any backward pass?
A: Before backward(), .grad is None because no gradients have been computed yet.

Q: How do you reset gradients in PyTorch before a new training step?
A: Calling optimizer.zero_grad() clears gradients to avoid accumulation across steps.

Q: Which tensors will have a .grad attribute populated after backward?
A: Only tensors with requires_grad=True track gradients and have .grad populated.

Q: Explain how to access and use the .grad attribute in PyTorch during training.
Hint: Think about the steps in a training loop involving gradients.
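One possible answer sketch: a minimal manual training loop that reads .grad after backward(), applies the update, and resets it. The data, weight, and learning rate are illustrative.

```python
import torch

# Toy data with a known relationship: y = 2 * x
xs = torch.tensor([1.0, 2.0, 3.0])
ys = torch.tensor([2.0, 4.0, 6.0])
w = torch.tensor(0.0, requires_grad=True)  # single learnable weight

lr = 0.05
for _ in range(200):
    loss = ((w * xs - ys) ** 2).mean()  # mean squared error
    loss.backward()                     # populates w.grad
    with torch.no_grad():
        w -= lr * w.grad                # use the gradient to update w
    w.grad.zero_()                      # reset before the next pass

print(round(w.item(), 2))  # w converges toward 2.0
```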
Q: Describe what happens if you forget to reset gradients before the next backward pass in PyTorch.
Hint: Consider how gradients behave across multiple backward calls.
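A self-contained demonstration of the failure mode (the tensor, loss, and loop are illustrative; lr=0 so only .grad changes between steps):

```python
import torch

w = torch.tensor([1.0], requires_grad=True)
opt = torch.optim.SGD([w], lr=0.0)

accumulated = []
for _ in range(3):
    loss = w.sum()   # d(loss)/dw = 1 on every pass
    loss.backward()  # adds 1 into w.grad each time
    accumulated.append(w.grad.item())
    # Forgot opt.zero_grad() here, so stale gradients pile up
print(accumulated)  # [1.0, 2.0, 3.0]

fresh = []
for _ in range(3):
    opt.zero_grad()  # reset first, so each backward starts clean
    loss = w.sum()
    loss.backward()
    fresh.append(w.grad.item())
print(fresh)  # [1.0, 1.0, 1.0]
```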