PyTorch · ~10 mins

Why automatic differentiation enables training in PyTorch - Test Your Understanding

Practice - 5 Tasks
Answer the questions below
1. Fill in the blank (easy)

Complete the code to create a tensor that tracks gradients for training.

PyTorch
import torch
x = torch.tensor([2.0, 3.0], requires_grad=[1])
A. True
B. False
C. None
D. 0
Common Mistakes
Setting requires_grad to False disables gradient tracking.
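As the note above confirms, gradient tracking requires `requires_grad=True`. A minimal runnable sketch of the completed snippet (the follow-up computation is illustrative):

```python
import torch

# requires_grad=True tells autograd to record operations on x.
x = torch.tensor([2.0, 3.0], requires_grad=True)

# Any computation on x now builds a graph that backward() can traverse.
y = x.pow(2).sum()
y.backward()

# d(sum(x^2))/dx = 2x
print(x.grad)  # tensor([4., 6.])
```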
2. Fill in the blank (medium)

Complete the code to compute the gradient of y with respect to x.

PyTorch
y = x.pow(2).sum()
y.backward([1])
A. retain_graph=True
B. create_graph=True
C. inputs=x
D. None
Common Mistakes
Passing unnecessary arguments to backward() when y is scalar.
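As the note says, a scalar loss needs no extra arguments: `backward()` on a scalar uses the implicit gradient of 1.0. A minimal sketch (tensor values are illustrative):

```python
import torch

x = torch.tensor([1.0, 4.0], requires_grad=True)
y = x.pow(2).sum()  # y is a scalar

# No gradient argument is needed for a scalar output;
# extra flags like retain_graph or create_graph are only
# for repeated backward passes or higher-order gradients.
y.backward()

print(x.grad)  # tensor([2., 8.])
```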
3. Fill in the blank (hard)

Fix the error in the code to correctly zero gradients before backward pass.

PyTorch
optimizer.zero_grad()
loss = model(input).sum()
loss.[1]()
optimizer.step()
A. backward
B. step
C. zero_grad
D. detach
Common Mistakes
Calling step() or zero_grad() instead of backward() on loss.
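The completed order is zero_grad, forward pass, backward, step. A runnable sketch, where the model, input shape, and learning rate are assumptions added to make the snippet self-contained:

```python
import torch
from torch import nn

model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
input = torch.randn(8, 4)

optimizer.zero_grad()       # clear gradients left over from the previous step
loss = model(input).sum()   # forward pass producing a scalar loss
loss.backward()             # populate each parameter's .grad
optimizer.step()            # update parameters using those gradients
```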
4. Fill in the blank (hard)

Fill both blanks to create a simple training loop that updates model parameters.

PyTorch
for data, target in dataloader:
    optimizer.[1]()
    output = model(data)
    loss = criterion(output, target)
    loss.[2]()
    optimizer.step()
A. zero_grad
B. backward
C. step
D. detach
Common Mistakes
Forgetting to zero gradients before backward pass.
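A runnable version of the loop above with the blanks filled (zero_grad, then backward). The model, loss, and tiny in-memory dataloader are assumptions added so the sketch is self-contained:

```python
import torch
from torch import nn

model = nn.Linear(2, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Stand-in for a real DataLoader: three (data, target) batches.
dataloader = [(torch.randn(4, 2), torch.randn(4, 1)) for _ in range(3)]

for data, target in dataloader:
    optimizer.zero_grad()              # clear gradients before each batch
    output = model(data)
    loss = criterion(output, target)
    loss.backward()                    # compute fresh gradients
    optimizer.step()                   # apply the parameter update
```

Without the zero_grad call, gradients from earlier batches would accumulate into `.grad` and corrupt each update.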
5. Fill in the blank (hard)

Fill all three blanks to define a tensor, compute a function, and get its gradient.

PyTorch
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=[1])
y = x.[2](2).sum()
y.[3]()
A. True
B. pow
C. backward
D. False
Common Mistakes
Not enabling gradient tracking or forgetting to call backward().
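Putting the three steps together (track gradients with `requires_grad=True`, compute with `pow`, then call `backward`), the completed snippet runs as:

```python
import torch

# Define a tensor with gradient tracking enabled.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# Compute a scalar function of x: y = x1^2 + x2^2 + x3^2.
y = x.pow(2).sum()

# Backpropagate to fill x.grad with dy/dx = 2x.
y.backward()

print(x.grad)  # tensor([2., 4., 6.])
```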