Recall & Review
beginner
What is the main purpose of a training loop in PyTorch?
The training loop repeatedly feeds batches of data to the model, computes predictions, measures the error with a loss function, and updates the model's parameters to reduce that error.
beginner
Name the four key steps inside a typical PyTorch training loop.
1. Forward pass: compute predictions from the inputs.
2. Compute loss: measure how wrong the predictions are.
3. Backward pass: compute gradients of the loss with respect to the model's parameters.
4. Optimizer step: update the parameters to reduce the loss.
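The four steps can be sketched as a minimal loop. The tiny linear model, random data, and hyperparameters below are illustrative assumptions, not part of the original card:

```python
import torch
from torch import nn

torch.manual_seed(0)

# Hypothetical tiny model and random data, purely for illustration
model = nn.Linear(3, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()
inputs = torch.randn(16, 3)
targets = torch.randn(16, 1)

model.train()  # training mode (matters for dropout/batchnorm layers)
losses = []
for epoch in range(10):
    optimizer.zero_grad()            # clear gradients from the last step
    preds = model(inputs)            # 1. forward pass
    loss = loss_fn(preds, targets)   # 2. compute loss
    loss.backward()                  # 3. backward pass: compute gradients
    optimizer.step()                 # 4. update parameters
    losses.append(loss.item())
```

With this setup the recorded loss shrinks over the ten epochs, which is the sign the loop is wired correctly.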
intermediate
Why do we call optimizer.zero_grad() before backward() in PyTorch?
Because gradients accumulate by default, zero_grad() clears old gradients so only the current batch's gradients are used for updating.
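The accumulation behavior is easy to see on a single tensor (the variable names here are illustrative):

```python
import torch

x = torch.tensor([2.0], requires_grad=True)

(x ** 2).sum().backward()
print(x.grad)  # tensor([4.]) -- d(x^2)/dx = 2x = 4

# Without zeroing, a second backward() ADDS to the stored gradient:
(x ** 2).sum().backward()
print(x.grad)  # tensor([8.]) -- 4 from before + 4 from this pass

# Clearing first restores the correct per-step gradient:
x.grad.zero_()
(x ** 2).sum().backward()
print(x.grad)  # tensor([4.])
```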
intermediate
What does model.train() do in a PyTorch training loop?
It sets the model to training mode, so layers such as dropout and batch normalization behave correctly: dropout randomly zeroes activations, and batch norm uses per-batch statistics. The counterpart, model.eval(), switches these layers to inference behavior.
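A small sketch of the mode switch using a standalone dropout layer (the input of ones is an illustrative assumption):

```python
import torch
from torch import nn

torch.manual_seed(0)
dropout = nn.Dropout(p=0.5)
x = torch.ones(1000)

dropout.train()            # training mode: dropout is active
out_train = dropout(x)     # roughly half the values zeroed,
                           # survivors scaled by 1/(1-p) = 2

dropout.eval()             # evaluation mode: dropout is a no-op
out_eval = dropout(x)      # identical to the input
```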
beginner
How can you track model performance during training?
By calculating and printing metrics like loss and accuracy after each batch or epoch to see if the model is improving.
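For a classifier, accuracy can be computed from the raw outputs in a couple of lines. The logits and labels below are made-up values for illustration:

```python
import torch

# Hypothetical logits for 4 samples over 2 classes, with true labels
logits = torch.tensor([[2.0, 0.5],
                       [0.1, 1.0],
                       [3.0, -1.0],
                       [0.2, 0.7]])
labels = torch.tensor([0, 1, 1, 1])

preds = logits.argmax(dim=1)                       # predicted class per sample
accuracy = (preds == labels).float().mean().item()
print(f"accuracy: {accuracy:.2f}")                 # 3 of 4 correct -> 0.75
```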
Which PyTorch function computes the gradients during training?
loss.backward() calculates the gradients needed for updating model parameters.
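On a scalar example the gradient can be checked by hand (the loss expression here is an illustrative assumption):

```python
import torch

w = torch.tensor(3.0, requires_grad=True)
loss = (2 * w - 4) ** 2      # a simple loss as a function of w
loss.backward()              # computes dloss/dw and stores it in w.grad
print(w.grad)                # chain rule: 2 * (2w - 4) * 2 = 8.0 at w = 3
```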
What is the purpose of optimizer.step() in the training loop?
optimizer.step() updates the model's parameters based on the gradients computed.
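With plain SGD the update is easy to verify by hand (single parameter and learning rate chosen for illustration):

```python
import torch

w = torch.tensor(2.0, requires_grad=True)
optimizer = torch.optim.SGD([w], lr=0.1)

loss = w ** 2
loss.backward()      # w.grad = 2w = 4
optimizer.step()     # SGD update: w <- w - lr * grad = 2 - 0.1 * 4
print(w.item())      # 1.6
```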
Why do we set model.train() before the training loop?
model.train() enables layers like dropout and batch normalization to behave correctly during training.
What happens if you forget to call optimizer.zero_grad() before backward()?
Gradients accumulate by default, so each backward() call adds to the gradients left over from previous batches, and optimizer.step() then applies these stale, summed gradients instead of the current batch's gradients.
Which of these is NOT part of a typical training loop?
Model evaluation on test data is done outside the training loop, usually after training.
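Evaluation after training typically pairs model.eval() with torch.no_grad(); the untrained linear model and random inputs below stand in for a trained model and a test set:

```python
import torch
from torch import nn

model = nn.Linear(3, 1)     # stand-in for a trained model

model.eval()                # inference behavior for dropout/batchnorm layers
with torch.no_grad():       # skip gradient tracking: faster, less memory
    preds = model(torch.randn(4, 3))

print(preds.requires_grad)  # False -- no autograd graph was built
```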
Describe the sequence of operations in a PyTorch training loop and why each step is important.
Think about how the model learns from data step by step.
Explain the role of model.train() and optimizer.zero_grad() in the training loop.
Consider what happens if these are not used.