PyTorch · ~10 mins

Gradient access (.grad) in PyTorch - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1 · fill in the blank · easy

Complete the code to access the gradient of tensor x after backward pass.

PyTorch
import torch
x = torch.tensor(2.0, requires_grad=True)
y = x * x
loss = y
loss.backward()
print(x___1___)
A) .grad
B) .data
C) .item()
D) .requires_grad
Common Mistakes
Using .data instead of .grad to access gradients.
Trying to access gradient before calling backward().
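For reference, the filled-in snippet can be checked as follows. With y = x², the derivative is 2x, so after backward() the gradient at x = 2 is 4:

```python
import torch

# Scalar input with gradient tracking enabled
x = torch.tensor(2.0, requires_grad=True)
y = x * x       # y = x^2
loss = y
loss.backward() # populates x.grad with dy/dx = 2x

grad = x.grad.item()  # 2 * 2.0 = 4.0
print(grad)
```

Note that before backward() is called, x.grad is None, which is why the second common mistake above fails at runtime.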
Task 2 · fill in the blank · medium

Complete the code to zero out the gradients of tensor x.

PyTorch
import torch
x = torch.tensor(3.0, requires_grad=True)
y = x * 3
loss = y
loss.backward()
x.grad.___1___()
A) clear
B) zero_
C) reset
D) zero
Common Mistakes
Using zero() instead of zero_(); only the trailing-underscore variant modifies the tensor in place.
Trying to assign None to x.grad instead of zeroing.
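A quick check of the completed code: y = 3x has a constant derivative of 3, so x.grad holds 3.0 after backward(), and the in-place zero_() call resets it to 0:

```python
import torch

x = torch.tensor(3.0, requires_grad=True)
y = x * 3
loss = y
loss.backward()          # x.grad is now 3.0 (dy/dx = 3)

x.grad.zero_()           # in-place: the trailing underscore mutates the tensor
after = x.grad.item()    # 0.0
print(after)
```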
Task 3 · fill in the blank · hard

Fix the error in accessing the gradient of tensor z after backward pass.

PyTorch
import torch
z = torch.tensor(4.0, requires_grad=True)
out = z ** 3
out.backward()
print(z___1___)
A) .data
B) .grad_fn
C) .grad.data
D) .grad
Common Mistakes
Using .grad_fn which gives the function that created the tensor, not the gradient.
Trying to access .data which is the raw tensor data, not the gradient.
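To verify the fix: out = z³ differentiates to 3z², so with z = 4 the gradient is 3 · 16 = 48:

```python
import torch

z = torch.tensor(4.0, requires_grad=True)
out = z ** 3
out.backward()          # d(z^3)/dz = 3 * z^2

grad = z.grad.item()    # 3 * 4.0^2 = 48.0
print(grad)
```

By contrast, z.grad_fn is None here (z is a leaf tensor); it is out.grad_fn that records the power operation that created out.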
Task 4 · fill in the blank · hard

Fill both blanks to compute the gradient of y = x^2 and print it.

PyTorch
import torch
x = torch.tensor(5.0, requires_grad=True)
y = x ___1___ 2
loss = y
loss.backward()
print(x___2___)
A) **
B) .grad
C) .data
D) *
Common Mistakes
Using * instead of ** for power operation.
Accessing .data instead of .grad for gradients.
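The two blanks resolved: ** is Python's power operator, and .grad holds the computed gradient. With y = x² and x = 5, backward() yields 2 · 5 = 10:

```python
import torch

x = torch.tensor(5.0, requires_grad=True)
y = x ** 2      # ** is the power operator; * would compute x * 2 instead
loss = y
loss.backward()

grad = x.grad.item()  # dy/dx = 2x = 10.0
print(grad)
```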
Task 5 · fill in the blank · hard

Fill all three blanks to compute gradients for z = a * b and print gradients of a and b.

PyTorch
import torch
a = torch.tensor(2.0, requires_grad=True)
b = torch.tensor(3.0, requires_grad=True)
z = a ___1___ b
z.backward()
print(a___2___)
print(b___3___)
A) *
B) .grad
D) +
Common Mistakes
Using + instead of * for multiplication.
Trying to access .data instead of .grad for gradients.
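With all three blanks filled, z = a · b gives ∂z/∂a = b and ∂z/∂b = a, so with a = 2 and b = 3 the gradients come out as 3 and 2 respectively:

```python
import torch

a = torch.tensor(2.0, requires_grad=True)
b = torch.tensor(3.0, requires_grad=True)
z = a * b
z.backward()   # one backward pass fills .grad on every leaf tensor that requires grad

ga = a.grad.item()  # dz/da = b = 3.0
gb = b.grad.item()  # dz/db = a = 2.0
print(ga, gb)
```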