Complete the code to create a tensor that tracks gradients.
import torch
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=[1])
Setting requires_grad=True tells PyTorch to track operations on the tensor for gradient computation.
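A minimal sketch of what that tracking buys you (the values here are illustrative): once requires_grad=True is set, autograd records every operation on the tensor, so backward() can populate its .grad attribute.

```python
import torch

# A leaf tensor with gradient tracking enabled.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# Operations on x are recorded in the autograd graph.
y = (x ** 2).sum()
y.backward()  # computes dy/dx = 2 * x

print(x.grad)  # tensor([2., 4., 6.])
```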
Complete the code to stop gradient tracking on a tensor.
import torch
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
x = x.detach()
x.requires_grad = [1]
Calling detach() returns a tensor removed from the computation graph, and setting requires_grad = False keeps gradient tracking disabled on it.
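The key property of detach(), sketched below with illustrative values, is that the result shares the original data but sits outside the autograd graph, with requires_grad already False:

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# detach() returns a new tensor that shares storage with x
# but is cut off from the autograd graph.
y = x.detach()

print(x.requires_grad)  # True  (x is unchanged)
print(y.requires_grad)  # False (y is outside the graph)
```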
Fix the error in the code to enable gradient tracking on a new tensor copied from x.
import torch
x = torch.tensor([1.0, 2.0, 3.0])
y = x.clone()
y.requires_grad = [1]
After cloning, set requires_grad=True to track gradients on the new tensor.
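A short sketch of that pattern (values are illustrative): because x does not track gradients, its clone is a leaf tensor, so assigning requires_grad directly is allowed.

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0])  # no gradient tracking
y = x.clone()                      # independent copy of the data
y.requires_grad = True             # allowed: y is a leaf tensor

print(x.requires_grad)  # False
print(y.requires_grad)  # True
```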
Fill both blanks to create a tensor with gradient tracking and then disable it.
import torch
x = torch.tensor([4.0, 5.0], requires_grad=[1])
x.requires_grad = [2]
First, create the tensor with requires_grad=True to track gradients, then disable it by setting requires_grad=False.
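With the blanks filled in as described, the toggle looks like this; flipping the flag in place is allowed here because x is a leaf tensor:

```python
import torch

x = torch.tensor([4.0, 5.0], requires_grad=True)
print(x.requires_grad)  # True

# Toggling the flag in place is allowed on leaf tensors.
x.requires_grad = False
print(x.requires_grad)  # False
```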
Fill all three blanks to create a tensor, clone it with gradients, and then disable gradients on the clone.
import torch
x = torch.tensor([7.0, 8.0], requires_grad=[1])
y = x.detach().clone()
y.requires_grad = [2]
z = y.detach()
z.requires_grad = [3]
Create x with gradient tracking enabled (True). Because requires_grad can only be assigned on leaf tensors, detach x before cloning so that y is a leaf copy, then enable gradient tracking on it (True). Finally, detach z from the graph and disable gradients (False).
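Putting the three steps together (values are illustrative), the sketch below uses detach().clone() so that each requires_grad assignment lands on a leaf tensor, which is the only place PyTorch permits it:

```python
import torch

x = torch.tensor([7.0, 8.0], requires_grad=True)

# requires_grad can only be assigned on leaf tensors, so detach
# before cloning to get a leaf copy outside x's graph.
y = x.detach().clone()
y.requires_grad = True

# detach() cuts z out of the graph; requires_grad on the result
# is already False, so the assignment just makes that explicit.
z = y.detach()
z.requires_grad = False

print(x.requires_grad, y.requires_grad, z.requires_grad)  # True True False
```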