PyTorch · ~10 mins

Why the training loop is explicit in PyTorch - Test Your Understanding

Practice - 5 Tasks
Answer the questions below
Task 1 - Fill in the blank (easy)

Complete the code to start the training loop by iterating over the data loader.

PyTorch
for [1] in train_loader:
    inputs, labels = batch
    # training steps follow
A. model
B. epoch
C. batch
D. optimizer
Common Mistakes
Using 'epoch' instead of 'batch' as the loop variable.
Using 'model' or 'optimizer' which are not iterable.
Task 2 - Fill in the blank (medium)

Complete the code to zero the gradients before backpropagation.

PyTorch
optimizer.[1]()
A. zero_grad
B. step
C. backward
D. eval
Common Mistakes
Calling optimizer.step(), which updates parameters instead of clearing gradients.
Using backward() or eval(), which are unrelated here.
Task 3 - Fill in the blank (hard)

Fix the error in the code by completing the line that performs backpropagation.

PyTorch
loss.[1]()
A. step
B. zero_grad
C. forward
D. backward
Common Mistakes
Using step(), which updates parameters rather than computing gradients.
Using zero_grad(), which clears gradients.
Using forward(), which runs the model.
Task 4 - Fill in the blank (hard)

Fill both blanks to update model parameters and calculate the loss.

PyTorch
optimizer.[1]()
loss = criterion(outputs, [2])
A. step
B. labels
C. inputs
D. zero_grad
Common Mistakes
Using zero_grad() instead of step() to update parameters.
Using inputs instead of labels for loss calculation.
Task 5 - Fill in the blank (hard)

Fill all three blanks to complete a simple explicit training loop in PyTorch.

PyTorch
for [1] in range(num_epochs):
    for batch in train_loader:
        inputs, labels = batch
        optimizer.[2]()
        outputs = model(inputs)
        loss = criterion(outputs, [3])
        loss.backward()
        optimizer.step()
A. epoch
B. zero_grad
C. labels
D. batch
Common Mistakes
Using 'batch' as the outer loop variable instead of 'epoch'.
Forgetting to zero gradients before backward().
Using inputs instead of labels for loss calculation.
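For reference once you have finished the tasks, here is a minimal runnable version of the explicit training loop the questions build toward. The synthetic data, the single nn.Linear model, and the hyperparameter values are illustrative assumptions, not part of the exercises; only the loop structure matches the snippets above.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Synthetic dataset: 64 samples, 10 features, 3 classes (illustrative values)
torch.manual_seed(0)
dataset = TensorDataset(torch.randn(64, 10), torch.randint(0, 3, (64,)))
train_loader = DataLoader(dataset, batch_size=16)

model = nn.Linear(10, 3)  # minimal stand-in model for illustration
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

num_epochs = 5
first_loss = None
for epoch in range(num_epochs):      # outer loop: passes over the dataset
    for batch in train_loader:       # inner loop: mini-batches from the loader
        inputs, labels = batch
        optimizer.zero_grad()        # clear stale gradients from the last step
        outputs = model(inputs)      # forward pass
        loss = criterion(outputs, labels)  # compare predictions to labels
        loss.backward()              # compute gradients
        optimizer.step()             # update parameters
        if first_loss is None:
            first_loss = loss.item()
```

Note the order inside the inner loop: zero_grad() before the forward pass, backward() after the loss is computed, and step() last, which is exactly the sequence the tasks probe.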