Practice - 5 Tasks
Answer the questions below.
1. Fill in the blank (easy)
Complete the code to access the gradient of tensor x after the backward pass.

PyTorch

import torch
x = torch.tensor(2.0, requires_grad=True)
y = x * x
loss = y
loss.backward()
print(x.__1__)
Common Mistakes
Using .data instead of .grad to access gradients.
Trying to access gradient before calling backward().
Explanation: The .grad attribute holds the gradient of the tensor after backward() is called.
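With blank 1 filled in as .grad, per the explanation above, the completed snippet runs as follows:

```python
import torch

# Scalar tensor with gradient tracking enabled
x = torch.tensor(2.0, requires_grad=True)
y = x * x          # y = x^2
loss = y
loss.backward()    # populates x.grad with dy/dx = 2x
print(x.grad)      # tensor(4.)
```

Since dy/dx = 2x and x = 2, the printed gradient is 4.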
2. Fill in the blank (medium)
Complete the code to zero out the gradient of tensor x.

PyTorch

import torch
x = torch.tensor(3.0, requires_grad=True)
y = x * 3
loss = y
loss.backward()
x.grad.__1__()
Common Mistakes
Using zero() instead of zero_(); in PyTorch, only methods with a trailing underscore modify the tensor in place.
Trying to assign None to x.grad instead of zeroing.
Explanation: The zero_() method resets the gradient tensor to zero in place.
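With blank 1 filled in as zero_, per the explanation above, the completed snippet runs as follows:

```python
import torch

x = torch.tensor(3.0, requires_grad=True)
y = x * 3
loss = y
loss.backward()    # x.grad is now tensor(3.)
x.grad.zero_()     # in-place reset; trailing underscore marks in-place ops
print(x.grad)      # tensor(0.)
```

After backward() the gradient is 3 (d(3x)/dx), and zero_() resets it to 0 without replacing the tensor object.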
3. Fill in the blank (hard)
Fix the error in accessing the gradient of tensor z after the backward pass.

PyTorch

import torch
z = torch.tensor(4.0, requires_grad=True)
out = z ** 3
out.backward()
print(z.__1__)
Common Mistakes
Using .grad_fn, which gives the function that created the tensor, not the gradient.
Trying to access .data, which is the raw tensor data, not the gradient.
Explanation: The correct way to access the gradient is z.grad after backward().
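With blank 1 filled in as .grad, per the explanation above, the completed snippet runs as follows:

```python
import torch

z = torch.tensor(4.0, requires_grad=True)
out = z ** 3
out.backward()     # populates z.grad with d(z^3)/dz = 3z^2
print(z.grad)      # tensor(48.)
```

With z = 4, the gradient 3z^2 evaluates to 48.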
4. Fill in the blanks (hard)
Fill both blanks to compute the gradient of y = x^2 and print it.

PyTorch

import torch
x = torch.tensor(5.0, requires_grad=True)
y = x __1__ 2
loss = y
loss.backward()
print(x.__2__)
Common Mistakes
Using * instead of ** for the power operation.
Accessing .data instead of .grad for gradients.
Explanation: Use ** for power and .grad to access the gradient after backward().
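With the blanks filled in as ** and .grad, per the explanation above, the completed snippet runs as follows:

```python
import torch

x = torch.tensor(5.0, requires_grad=True)
y = x ** 2         # blank 1: ** is Python's power operator
loss = y
loss.backward()    # populates x.grad with dy/dx = 2x
print(x.grad)      # blank 2: .grad; prints tensor(10.)
```

With x = 5, the gradient 2x evaluates to 10.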
5. Fill in the blanks (hard)
Fill all three blanks to compute gradients for z = a * b and print the gradients of a and b.

PyTorch

import torch
a = torch.tensor(2.0, requires_grad=True)
b = torch.tensor(3.0, requires_grad=True)
z = a __1__ b
z.backward()
print(a.__2__)
print(b.__3__)
Common Mistakes
Using + instead of * for multiplication.
Trying to access .data instead of .grad for gradients.
Explanation: Use * for multiplication and .grad to access the gradients of both tensors.
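With the blanks filled in as * and .grad, per the explanation above, the completed snippet runs as follows:

```python
import torch

a = torch.tensor(2.0, requires_grad=True)
b = torch.tensor(3.0, requires_grad=True)
z = a * b          # blank 1: * for multiplication
z.backward()       # dz/da = b, dz/db = a
print(a.grad)      # blank 2: .grad; prints tensor(3.)
print(b.grad)      # blank 3: .grad; prints tensor(2.)
```

For a product z = a * b, each gradient is the other factor: a.grad equals b (3), and b.grad equals a (2).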