PyTorch · ML · ~20 mins

Epoch-based training in PyTorch - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
Epoch Training Master
Get all challenges correct to earn this badge!
Predict Output
intermediate
Time limit: 2:00
Output of a simple epoch training loop
What will be the printed output after running this PyTorch training loop for 2 epochs?

Assume the model, optimizer, and loss function are correctly defined and the dataloader has 3 batches with dummy data.

Code:
PyTorch
for epoch in range(2):
    total_loss = 0
    for batch in range(3):
        loss = batch + 1  # dummy loss values: 1, 2, 3
        total_loss += loss
    print(f"Epoch {epoch+1}, Loss: {total_loss}")
A
Epoch 1, Loss: 6
Epoch 2, Loss: 6
B
Epoch 1, Loss: 3
Epoch 2, Loss: 3
C
Epoch 1, Loss: 1
Epoch 2, Loss: 2
D
Epoch 1, Loss: 6
Epoch 2, Loss: 9
💡 Hint
Sum the dummy loss values for each batch inside the epoch.
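As a worked check, the snippet above can be run as plain Python (no PyTorch needed, since the losses are dummy integers). Note that total_loss is reset at the top of every epoch, so both epochs sum the same values:

```python
# Each epoch sums the dummy batch "losses" 1 + 2 + 3 = 6.
# Because total_loss is reset at the start of each epoch,
# every epoch prints the same total.
for epoch in range(2):
    total_loss = 0
    for batch in range(3):
        loss = batch + 1  # dummy loss values: 1, 2, 3
        total_loss += loss
    print(f"Epoch {epoch+1}, Loss: {total_loss}")
```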
Model Choice
intermediate
Time limit: 1:30
Choosing the correct place to reset loss in epoch training
In an epoch-based training loop, where should you reset the total loss variable to zero to correctly track loss per epoch?
A
Before the start of each batch loop
B
Before the start of each epoch loop
C
After the end of all epochs
D
Inside the optimizer step
💡 Hint
Think about when you want to start fresh loss tracking for each epoch.
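To see why placement matters, here is a minimal sketch using made-up batch losses: resetting total_loss at the top of the epoch loop (not the batch loop) gives a fresh per-epoch total.

```python
# Dummy per-batch losses stand in for real loss values.
dummy_batch_losses = [0.9, 0.7, 0.5]

for epoch in range(2):
    total_loss = 0.0          # reset once, at the start of each epoch
    for loss in dummy_batch_losses:
        total_loss += loss    # accumulate within the epoch
    print(f"Epoch {epoch+1}, Loss: {total_loss:.1f}")
```

Resetting inside the batch loop would overwrite the running total on every batch, leaving only the last batch's loss at the end of the epoch.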
Hyperparameter
advanced
Time limit: 1:30
Effect of increasing number of epochs on model training
What is the most likely effect of increasing the number of epochs during training of a neural network?
A
Model may overfit the training data if epochs are too high
B
Model will always perform better on unseen data
C
Training time decreases as epochs increase
D
Loss will increase on training data as epochs increase
💡 Hint
Think about what happens if the model learns too much from training data.
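A common guard against training for too many epochs is early stopping on validation loss. This is a minimal sketch with hypothetical validation losses (improving at first, then rising as the model starts to overfit):

```python
# Hypothetical validation losses, one per epoch.
val_losses = [1.0, 0.8, 0.7, 0.72, 0.75, 0.8]

best = float("inf")
patience, bad_epochs = 2, 0   # stop after 2 epochs with no improvement
for epoch, val_loss in enumerate(val_losses, start=1):
    if val_loss < best:
        best, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            print(f"Early stopping at epoch {epoch} (best val loss {best})")
            break
```

With these made-up values, training halts once validation loss has failed to improve for two consecutive epochs, even though training loss would keep falling.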
Metrics
advanced
Time limit: 1:30
Calculating average loss per epoch
Given a training loop that sums batch losses in total_loss and processes 5 batches per epoch, how do you calculate the average loss per epoch?
A
Multiply total_loss by number of batches (5)
B
Subtract number of batches (5) from total_loss
C
Add number of batches (5) to total_loss
D
Divide total_loss by number of batches (5)
💡 Hint
Average means total divided by count.
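The calculation can be sketched directly with dummy batch losses (five values standing in for the five batches per epoch):

```python
# Five dummy batch losses for one epoch.
batch_losses = [2.0, 1.5, 1.0, 0.8, 0.7]

total_loss = sum(batch_losses)                 # summed inside the epoch
avg_loss = total_loss / len(batch_losses)      # divide by batch count (5)
print(f"Total: {total_loss:.1f}, Average: {avg_loss:.2f}")
```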
🔧 Debug
expert
Time limit: 2:30
Identifying the bug in epoch training loop
What error will this PyTorch epoch training loop raise?

Code:
for epoch in range(3):
    total_loss = 0
    for batch in dataloader:
        optimizer.zero_grad()
        outputs = model(batch[0])
        loss = criterion(outputs, batch[1])
        loss.backward()
        optimizer.step()
        total_loss += loss
    print(f"Epoch {epoch+1} Loss: {total_loss}")
ARuntimeError: loss.backward() called multiple times
BNameError: name 'optimizer' is not defined
CTypeError: unsupported operand type(s) for +=: 'int' and 'Tensor'
DNo error, code runs correctly
💡 Hint
Check the type of loss and total_loss before adding.
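Whatever the snippet raises, the accumulation line is worth examining: `total_loss += loss` makes total_loss a graph-carrying tensor, which keeps each batch's autograd graph alive across the epoch. The conventional pattern extracts a Python float with `loss.item()`. This is a minimal runnable sketch; the tiny model, data, and hyperparameters are placeholders, not part of the original question:

```python
import torch
from torch import nn

# Tiny stand-in model and data so the loop is runnable; the point is
# the accumulation pattern, not the model itself.
model = nn.Linear(4, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
dataloader = [(torch.randn(8, 4), torch.randn(8, 1)) for _ in range(3)]

for epoch in range(3):
    total_loss = 0.0
    for inputs, targets in dataloader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()
        # .item() returns a plain Python float, so total_loss stays a
        # number and the batch's autograd graph can be freed.
        total_loss += loss.item()
    print(f"Epoch {epoch+1} Loss: {total_loss:.4f}")
```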