Practice - 5 Tasks
Answer the questions below
1. Fill in the blank (easy): Complete the code to start the training loop by iterating over the data loader.

PyTorch:
for [1] in train_loader:
    inputs, labels = batch
    # training steps follow
💡 Hint: Common Mistakes
Using 'epoch' instead of 'batch' as the loop variable.
Using 'model' or 'optimizer' which are not iterable.
Explanation:
The training loop iterates over batches of data from the train_loader. 'batch' is the variable holding each batch.
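To see the shape of this iteration without needing torch installed, here is a pure-Python stand-in where a plain list of (inputs, labels) pairs plays the role of `train_loader`; the data values are made up for illustration:

```python
# DataLoader-like iterable: each element is one batch, a (inputs, labels) pair.
train_loader = [([1.0, 2.0], [0, 1]),
                ([3.0, 4.0], [1, 0])]

for batch in train_loader:     # one iteration per batch, not per epoch
    inputs, labels = batch     # unpack exactly as in the question
    print(len(inputs), len(labels))
```

Each pass through the loop body handles one batch, which is why 'batch' (not 'epoch') is the natural loop variable here.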
2. Fill in the blank (medium): Complete the code to zero the gradients before backpropagation.

PyTorch:
optimizer.[1]()
💡 Hint: Common Mistakes
Calling optimizer.step() before zeroing gradients.
Using backward() or eval() which are unrelated here.
Explanation:
Before computing gradients, we clear old gradients with optimizer.zero_grad().
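The reason zeroing matters is that PyTorch accumulates gradients into `.grad` across backward passes rather than overwriting them. A minimal pure-Python sketch of that accumulation behavior (the `Param` class and helper functions here are toy stand-ins, not the real torch API):

```python
class Param:
    """Toy stand-in for a PyTorch parameter: gradients accumulate in .grad."""
    def __init__(self, value):
        self.value = value
        self.grad = 0.0

def backward(param, new_grad):
    # Like loss.backward(): adds to the existing .grad, never overwrites it.
    param.grad += new_grad

def zero_grad(param):
    # Like optimizer.zero_grad(): reset before the next backward pass.
    param.grad = 0.0

w = Param(1.0)
backward(w, 0.5)
backward(w, 0.5)   # forgot to zero: gradients pile up
print(w.grad)      # 1.0, twice the true gradient

zero_grad(w)
backward(w, 0.5)
print(w.grad)      # 0.5, correct after zeroing
```

Skipping the zeroing step makes each update use stale gradients from earlier batches, which silently corrupts training.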
3. Fill in the blank (hard): Fix the error in the code by completing the line that performs backpropagation.

PyTorch:
loss.[1]()
💡 Hint: Common Mistakes
Using step() which updates parameters, not computes gradients.
Using zero_grad() which clears gradients.
Using forward() which runs the model.
Explanation:
Calling loss.backward() computes gradients for backpropagation.
4. Fill in the blank (hard): Fill both blanks to update model parameters and calculate the loss.

PyTorch:
optimizer.[1]()
loss = criterion(outputs, [2])
💡 Hint: Common Mistakes
Using zero_grad() instead of step() to update parameters.
Using inputs instead of labels for loss calculation.
Explanation:
optimizer.step() updates parameters; loss is calculated comparing outputs to labels.
5. Fill in the blank (hard): Fill all three blanks to complete a simple explicit training loop in PyTorch.

PyTorch:
for [1] in range(num_epochs):
    for batch in train_loader:
        inputs, labels = batch
        optimizer.[2]()
        outputs = model(inputs)
        loss = criterion(outputs, [3])
        loss.backward()
        optimizer.step()
💡 Hint: Common Mistakes
Using 'batch' as the outer loop variable instead of 'epoch'.
Forgetting to zero gradients before backward().
Using inputs instead of labels for loss calculation.
Explanation:
The outer loop uses 'epoch', gradients are zeroed with zero_grad(), and loss compares outputs to labels.
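The full epoch/batch structure can be exercised end to end without torch by computing the gradient by hand for a 1-D linear model y = w * x with MSE loss. This is a hedged pure-Python analogue (the data, learning rate, and model are invented for illustration), with each line mapped to its PyTorch counterpart in a comment:

```python
# Pure-Python analogue of the loop above: same epoch/batch structure,
# gradient of MSE loss computed by hand for the model output = w * x.
train_loader = [[(1.0, 2.0), (2.0, 4.0)],    # two batches of (input, label)
                [(3.0, 6.0), (4.0, 8.0)]]    # true relationship: y = 2 * x
w, lr, num_epochs = 0.0, 0.05, 20

for epoch in range(num_epochs):              # outer loop: epochs
    for batch in train_loader:               # inner loop: batches
        grad = 0.0                           # optimizer.zero_grad()
        for x, y in batch:
            output = w * x                   # outputs = model(inputs)
            # MSE loss (output - y)**2 has d(loss)/dw = 2 * (output - y) * x
            grad += 2 * (output - y) * x     # loss.backward()
        grad /= len(batch)
        w -= lr * grad                       # optimizer.step()

print(round(w, 3))  # converges to 2.0
```

Zeroing the accumulator at the top of each batch, computing the loss against the labels (not the inputs), and stepping the parameter afterward mirror exactly the three blanks in the question.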