PyTorch · ~5 mins

Forward pass, loss, backward, step in PyTorch - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is the purpose of the forward pass in a neural network?
The forward pass calculates the output of the network by passing input data through its layers. It shows how the input transforms into predictions.
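A minimal sketch of a forward pass (the layer sizes and batch here are made up for illustration):

```python
import torch
import torch.nn as nn

# A tiny illustrative model: 4 input features -> 2 outputs.
model = nn.Linear(4, 2)
x = torch.randn(8, 4)       # a batch of 8 samples

# Forward pass: input flows through the layer(s) to produce predictions.
predictions = model(x)
print(predictions.shape)    # one prediction row per input sample
```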
beginner
What does the loss function measure in training a model?
The loss function measures how far the model's predictions are from the true answers. It gives a number showing the error to minimize.
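For example, with mean squared error (the values below are hypothetical, just to show the API):

```python
import torch
import torch.nn as nn

# Made-up predictions and true targets.
predictions = torch.tensor([[2.0], [4.0]])
targets = torch.tensor([[3.0], [5.0]])

# MSE: average of (prediction - target)^2.
loss_fn = nn.MSELoss()
loss = loss_fn(predictions, targets)
print(loss.item())  # ((2-3)^2 + (4-5)^2) / 2 = 1.0
```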
intermediate
What happens during the backward pass (backpropagation)?
During the backward pass, the model calculates gradients that show how to change each weight to reduce the loss. It moves backward through the network.
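A tiny sketch of backward() on a single weight (the function here is invented to keep the gradient easy to check by hand):

```python
import torch

# loss = (3w)^2 = 9w^2, so d(loss)/dw = 18w.
w = torch.tensor(2.0, requires_grad=True)
loss = (w * 3) ** 2

# backward() walks the graph from the loss back to w, filling w.grad.
loss.backward()
print(w.grad)  # 18 * 2 = 36
```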
beginner
What does the optimizer's step() function do?
The step() function updates the model's weights using the gradients calculated in the backward pass. It moves weights to reduce the loss.
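With plain SGD, step() applies `w <- w - lr * grad`. A one-weight sketch (the learning rate and values are illustrative):

```python
import torch

w = torch.tensor([1.0], requires_grad=True)
opt = torch.optim.SGD([w], lr=0.1)

loss = (w ** 2).sum()   # d(loss)/dw = 2w = 2.0
loss.backward()
opt.step()              # w <- 1.0 - 0.1 * 2.0 = 0.8
print(w.item())
```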
intermediate
Why do we zero the gradients before the backward pass?
We zero the gradients to clear old gradient values. Otherwise, gradients would accumulate and give wrong updates.
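The accumulation is easy to see on a single scalar (a contrived example; in a real loop you would call `optimizer.zero_grad()` instead of zeroing by hand):

```python
import torch

w = torch.tensor(1.0, requires_grad=True)

# Two backward passes WITHOUT zeroing: gradients add up.
(w * 2).backward()
print(w.grad)        # 2.0
(w * 2).backward()
print(w.grad)        # 4.0 -- accumulated, not replaced

# Zeroing (done by hand here) resets before the next iteration.
w.grad.zero_()
(w * 2).backward()
print(w.grad)        # 2.0 again
```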
What is the first step in training a neural network?
A. Performing the forward pass
B. Calling optimizer.step()
C. Calculating gradients
D. Zeroing the gradients

Which function calculates how wrong the model's predictions are?
A. optimizer.step()
B. loss function
C. backward()
D. zero_grad()

What does backward() do in PyTorch?
A. Calculates gradients
B. Clears gradients
C. Updates weights
D. Computes loss

Why do we call optimizer.zero_grad() before backward()?
A. To update weights
B. To reset weights
C. To calculate loss
D. To clear old gradients

What does optimizer.step() do?
A. Calculates loss
B. Computes gradients
C. Updates model weights
D. Clears gradients
Explain the sequence of steps in one training iteration of a PyTorch model, including the forward pass, loss, backward pass, and step.
Think about what happens from input to weight update.
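All five steps fit in one short loop body. A minimal sketch of a single training iteration (the model, data, and learning rate are made up for illustration):

```python
import torch
import torch.nn as nn

model = nn.Linear(3, 1)
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(16, 3)          # a batch of inputs
y = torch.randn(16, 1)          # matching targets

optimizer.zero_grad()           # 1. clear gradients from the last iteration
predictions = model(x)          # 2. forward pass: input -> predictions
loss = loss_fn(predictions, y)  # 3. loss: how wrong the predictions are
loss.backward()                 # 4. backward pass: compute gradients
optimizer.step()                # 5. step: update weights using the gradients
```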
Why is it important to zero gradients before calling backward() in PyTorch training?
Consider what happens if you don't clear gradients.