PyTorch · ~10 mins

Backward pass (loss.backward) in PyTorch - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to perform the backward pass and compute gradients.

PyTorch
loss.[1]()
A. backward
B. forward
C. step
D. zero_grad
Common Mistakes
Using 'forward()', which only runs the forward pass, instead of 'backward()'.
Calling 'step()', which updates parameters but does not compute gradients.
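To see what backward() actually does, here is a minimal runnable sketch. The scalar weight, input, and target below are made up for illustration and are not part of the exercise:

```python
import torch

# A single learnable weight and a squared-error loss.
w = torch.tensor(2.0, requires_grad=True)
x = torch.tensor(3.0)
target = torch.tensor(7.0)

loss = (w * x - target) ** 2   # forward pass: (2*3 - 7)^2 = 1
loss.backward()                # backward pass: populates w.grad

# d(loss)/dw = 2 * (w*x - target) * x = 2 * (-1) * 3 = -6
print(w.grad)                  # tensor(-6.)
```

Note that backward() returns nothing; its effect is to fill the .grad attribute of every leaf tensor with requires_grad=True.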
Task 2: Fill in the blank (medium)

Complete the code to reset gradients before the backward pass.

PyTorch
optimizer.[1]()
A. zero_grad
B. step
C. backward
D. update
Common Mistakes
Calling 'step()' before 'backward()', which updates parameters before gradients have been computed.
Not resetting gradients, so gradients from previous iterations accumulate and produce incorrect sums.
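Why the reset matters can be demonstrated in a few lines. The scalar weight and SGD optimizer below are illustrative assumptions, not part of the exercise:

```python
import torch

w = torch.tensor(1.0, requires_grad=True)
optimizer = torch.optim.SGD([w], lr=0.1)

# Without zero_grad(), successive backward passes ADD their gradients.
(w * 2).backward()   # d(2w)/dw = 2
(w * 2).backward()   # another +2
print(w.grad)        # tensor(4.) — accumulated, not 2

# Resetting first leaves only the gradient of the latest pass.
optimizer.zero_grad()
(w * 2).backward()
print(w.grad)        # tensor(2.)
```

This accumulation is deliberate in PyTorch (it enables tricks like gradient accumulation over micro-batches), which is exactly why an explicit zero_grad() is needed in an ordinary training loop.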
Task 3: Fill in the blank (hard)

Fix the error in the code to correctly perform a training step.

PyTorch
optimizer.zero_grad()
output = model(input)
loss = criterion(output, target)
loss.[1]()
optimizer.step()
A. zero_grad
B. forward
C. step
D. backward
Common Mistakes
Calling 'loss.step()', a method that does not exist.
Forgetting to call 'loss.backward()', so no gradients are computed.
Task 4: Fill in the blanks (hard)

Fill both blanks to correctly compute loss and perform backward pass.

PyTorch
output = model(input)
loss = [1](output, target)
loss.[2]()
A. criterion
B. optimizer
C. backward
D. forward
Common Mistakes
Using 'optimizer' instead of 'criterion' to compute loss.
Calling 'forward()' instead of 'backward()' on loss.
Task 5: Fill in the blanks (hard)

Fill all three blanks to complete a training step with gradient reset, loss computation, and backward pass.

PyTorch
optimizer.[1]()
output = model(input)
loss = [2](output, target)
loss.[3]()
A. zero_grad
B. criterion
C. backward
D. step
Common Mistakes
Calling 'step()' before 'backward()', which updates parameters too early.
Not resetting gradients, causing accumulation errors.
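Putting all the pieces from the tasks above together, here is a minimal runnable sketch of a complete training step. The tiny linear model, MSE loss, SGD optimizer, and random data are made up for illustration; the variable names mirror the blanks in the exercises:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)  # reproducible random data for the sketch

model = nn.Linear(3, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

input = torch.randn(4, 3)
target = torch.randn(4, 1)

for _ in range(5):
    optimizer.zero_grad()              # [1] reset gradients
    output = model(input)              # forward pass
    loss = criterion(output, target)   # [2] compute loss
    loss.backward()                    # [3] compute gradients
    optimizer.step()                   # update parameters
    print(loss.item())                 # loss shrinks over iterations
```

The order is the whole point: zero_grad() before backward() so old gradients do not accumulate, and step() after backward() so the update uses freshly computed gradients.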