Recall & Review
beginner
What is the purpose of the forward pass in a neural network?
The forward pass computes the network's output by passing input data through its layers in sequence. Each layer transforms its input, step by step, until the final layer produces predictions.
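As a concrete illustration, a forward pass through a single linear layer is just a weighted sum of the inputs plus a bias. This is a minimal sketch with hand-picked, illustrative weights (the `forward` function and all values here are hypothetical, not from any library):

```python
# A toy forward pass through one linear "layer" with 2 inputs and 1 output.
def forward(x, weights, bias):
    # Weighted sum of inputs plus bias: the layer's output.
    return sum(w * xi for w, xi in zip(weights, x)) + bias

# Input [1.0, 2.0] flows through the layer and comes out as a prediction.
prediction = forward([1.0, 2.0], weights=[0.5, -0.25], bias=0.1)
# 0.5*1.0 + (-0.25)*2.0 + 0.1 = 0.1
```

A real network stacks many such layers (usually with nonlinearities in between), but the idea is the same: data flows forward, layer by layer, into a prediction.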
beginner
What does the loss function measure in training a model?
The loss function measures how far the model's predictions are from the true answers. It gives a number showing the error to minimize.
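For example, mean squared error (one common loss function) averages the squared differences between predictions and targets. A hand-computed sketch with illustrative values:

```python
# Toy mean-squared-error loss, computed by hand.
def mse(predictions, targets):
    # Average of squared differences: one number summarizing the error.
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(predictions)

loss = mse([2.5, 0.0], [3.0, -1.0])
# ((2.5-3.0)**2 + (0.0-(-1.0))**2) / 2 = (0.25 + 1.0) / 2 = 0.625
```

The smaller this number, the closer the predictions are to the true answers, which is why training tries to minimize it.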
intermediate
What happens during the backward pass (backpropagation)?
During the backward pass, the model calculates gradients that show how changing each weight would affect the loss. It works backward through the network, layer by layer, applying the chain rule to propagate the error signal.
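A minimal PyTorch sketch of this: for `loss = w**2`, calculus says the gradient is `2*w`, and `backward()` fills in exactly that value (the tensor values here are illustrative):

```python
import torch

# A single "parameter" whose gradient autograd will track.
w = torch.tensor(3.0, requires_grad=True)

loss = w ** 2        # d(loss)/dw = 2*w, so at w=3.0 we expect 6.0
loss.backward()      # backward pass: populates w.grad

# w.grad is now tensor(6.)
```

In a real model, `backward()` does this for every parameter at once, giving the optimizer everything it needs for the update.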
beginner
What does the optimizer's step() function do?
The step() function updates the model's weights using the gradients calculated in the backward pass. It moves weights to reduce the loss.
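With plain SGD, `step()` applies the rule `w = w - lr * grad`. A small sketch with illustrative numbers:

```python
import torch

w = torch.tensor(3.0, requires_grad=True)
opt = torch.optim.SGD([w], lr=0.1)

loss = w ** 2
loss.backward()      # w.grad is now 6.0 (gradient of w**2 at w=3.0)
opt.step()           # w becomes 3.0 - 0.1 * 6.0 = 2.4
```

Other optimizers (Adam, RMSprop, ...) use fancier update rules, but they all consume the gradients that `backward()` produced.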
intermediate
Why do we zero the gradients before the backward pass?
We zero the gradients to clear the values left over from the previous iteration. PyTorch accumulates (adds) new gradients onto existing ones, so without zeroing, each update would be based on a sum of stale and current gradients and would be incorrect.
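You can see the accumulation directly: calling `backward()` twice without zeroing doubles the stored gradient (values here are illustrative):

```python
import torch

w = torch.tensor(3.0, requires_grad=True)

(w ** 2).backward()
first = w.grad.item()        # 6.0, as expected

(w ** 2).backward()          # no zeroing in between: gradients are ADDED
accumulated = w.grad.item()  # 12.0, not 6.0

w.grad.zero_()               # clearing restores a clean slate
```

This is why training loops call `optimizer.zero_grad()` once per iteration, before (or right after) the backward pass.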
What is the first step in training a neural network?
Training starts with the forward pass: input data is passed through the network to produce predictions.
Which function calculates how wrong the model's predictions are?
The loss function measures the difference between predictions and true values.
What does backward() do in PyTorch?
backward() computes gradients of the loss with respect to model parameters.
Why do we call optimizer.zero_grad() before backward()?
zero_grad() clears previous gradients to avoid accumulation.
What does optimizer.step() do?
optimizer.step() updates the model's weights using the gradients.
Explain the sequence of steps in one training iteration of a PyTorch model including forward pass, loss, backward, and step.
Think about what happens from input to weight update.
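The full sequence can be sketched as a minimal one-iteration training loop. The model shape, learning rate, and random toy data below are illustrative, but the order of the five steps is the standard PyTorch pattern:

```python
import torch
import torch.nn as nn

model = nn.Linear(2, 1)                                   # toy model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # illustrative lr
loss_fn = nn.MSELoss()

x = torch.randn(4, 2)        # a small batch of inputs (toy data)
y = torch.randn(4, 1)        # matching targets

optimizer.zero_grad()        # 1. clear stale gradients
pred = model(x)              # 2. forward pass: inputs -> predictions
loss = loss_fn(pred, y)      # 3. loss: how wrong the predictions are
loss.backward()              # 4. backward pass: compute gradients
optimizer.step()             # 5. update weights using the gradients
```

Real training repeats this loop over many batches and epochs, but every iteration follows this same zero_grad / forward / loss / backward / step rhythm.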
Why is it important to zero gradients before calling backward in PyTorch training?
Consider what happens if you don't clear gradients.
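In loop form, zeroing each iteration keeps every update based only on the current gradient. A small sketch with illustrative numbers (single parameter, SGD, lr=0.1):

```python
import torch

w = torch.tensor(3.0, requires_grad=True)
opt = torch.optim.SGD([w], lr=0.1)

for _ in range(2):
    opt.zero_grad()          # without this, old gradients would be added in
    loss = w ** 2
    loss.backward()
    opt.step()

# Iteration 1: grad 6.0, w -> 3.0 - 0.6 = 2.4
# Iteration 2: grad 4.8, w -> 2.4 - 0.48 = 1.92
```

If `zero_grad()` were omitted, the second iteration's stored gradient would be 6.0 + 4.8 = 10.8, and the update would overshoot.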