PyTorch · ~10 mins

Zeroing gradients in PyTorch - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (Easy)

Complete the code to zero the gradients of the optimizer before the backward pass.

PyTorch
optimizer.[1]()
A. zero_grad
B. step
C. backward
D. zero
Common Mistakes
Using optimizer.step() instead of optimizer.zero_grad() to clear gradients.
Calling backward() on the optimizer; backward() is a method of the loss tensor, not the optimizer.
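As a sketch of how this call fits into a training step (the tiny model, random data, and learning rate below are illustrative assumptions, not part of the task):

```python
import torch

# Illustrative sketch: zero accumulated gradients before the backward pass.
model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(4, 2)
y = torch.randn(4, 1)

optimizer.zero_grad()                # clear gradients left over from any prior step
loss = ((model(x) - y) ** 2).mean()
loss.backward()                      # .grad now holds fresh, non-accumulated values
```

Without the zero_grad() call, backward() would add the new gradients onto whatever was already stored in each parameter's .grad.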
Task 2: Fill in the blank (Medium)

Complete the code to zero the gradients of the model's parameters before backpropagation.

PyTorch
for param in model.parameters():
    if param.grad is not None:
        param.grad.[1]()
A. zero_grad
B. zero_
C. clear
D. reset
Common Mistakes
Calling zero_grad() on the gradient tensor; tensors have no zero_grad() method.
Using clear() or reset(), which are not valid tensor methods.
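A runnable sketch of this manual loop (the toy model and random input are assumptions for illustration): the trailing underscore in zero_() marks it as an in-place tensor operation.

```python
import torch

# Illustrative sketch: zero each parameter's gradient tensor in place with zero_().
model = torch.nn.Linear(2, 1)
model(torch.randn(3, 2)).sum().backward()   # populate .grad on every parameter

for param in model.parameters():
    if param.grad is not None:              # grads exist only after a backward pass
        param.grad.zero_()                  # in-place zeroing, hence the underscore
```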
Task 3: Fill in the blank (Hard)

Fix the error in zeroing gradients before the backward pass.

PyTorch
optimizer.[1]()
loss.backward()
optimizer.step()
A. zero_grad
B. step
C. backward
D. clear_grad
Common Mistakes
Calling optimizer.step() before zeroing gradients.
Using a non-existent method like clear_grad().
Task 4: Fill in the blank (Hard)

Fill both blanks to zero gradients for all parameters in the model.

PyTorch
for param in model.parameters():
    if param.grad is not None:
        param.grad.[1]()
model.[2]()
A. zero_
B. step
C. zero_grad
D. backward
Common Mistakes
Using step() instead of zero_grad() to clear gradients.
Confusing backward() with zeroing gradients.
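The module-level call does the same job as the manual loop in one line; a sketch (toy model and data are illustrative assumptions). Note that recent PyTorch versions default to set_to_none=True, which resets .grad to None rather than zeroing it; set_to_none=False is used here so the zeroed tensors remain inspectable.

```python
import torch

# Illustrative sketch: nn.Module.zero_grad() clears every parameter's gradient
# in one call, equivalent to looping over model.parameters() manually.
model = torch.nn.Linear(2, 1)
model(torch.randn(3, 2)).sum().backward()   # populate gradients

model.zero_grad(set_to_none=False)          # keep zeroed tensors instead of None
```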
Task 5: Fill in the blank (Hard)

Fill all three blanks to zero gradients, perform backward pass, and update weights.

PyTorch
optimizer.[1]()
loss.[2]()
optimizer.[3]()
A. step
B. backward
C. zero_grad
D. clear
Common Mistakes
Calling step() before zero_grad() or backward().
Using clear() which is not a valid optimizer method.
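The three-step sequence from this task can be sketched as a complete toy training loop (the regression target, seed, and learning rate are hypothetical illustrations):

```python
import torch
import torch.nn.functional as F

# Canonical order per step: zero_grad -> backward -> step.
torch.manual_seed(0)
model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
x = torch.randn(16, 1)
y = 3 * x                         # toy target: y = 3x

losses = []
for _ in range(5):
    optimizer.zero_grad()         # 1. clear gradients from the previous step
    loss = F.mse_loss(model(x), y)
    loss.backward()               # 2. compute fresh gradients
    optimizer.step()              # 3. update the weights
    losses.append(loss.item())
```

With this ordering the loss decreases across iterations; swapping step() ahead of zero_grad() and backward() would update weights with stale or missing gradients.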