PyTorch · ~10 mins

Why checkpointing preserves progress in PyTorch - Test Your Understanding

Practice - 5 Tasks
Answer the questions below
Task 1: fill in the blank (easy)

Complete the code to save the model's state dictionary.

PyTorch
torch.save(model[1], 'model.pth')
A) .gradients
B) .parameters()
C) .weights
D) .state_dict()
Common Mistakes
Saving model.parameters() instead of model.state_dict()
Saving model.weights, which does not exist
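A minimal runnable sketch of the pattern Task 1 tests (assumes PyTorch is installed; the `nn.Linear` stand-in model and the temporary-file path are illustrative, not part of the task):

```python
import os
import tempfile

import torch
import torch.nn as nn

model = nn.Linear(4, 2)  # tiny stand-in model for illustration

path = os.path.join(tempfile.mkdtemp(), "model.pth")

# state_dict() returns an OrderedDict mapping parameter names to tensors.
# model.parameters() is a generator (not what you want to save), and
# model.weights does not exist on nn.Module.
torch.save(model.state_dict(), path)

print(sorted(torch.load(path).keys()))  # ['bias', 'weight']
```

Saving the state dictionary rather than the whole model object keeps the checkpoint decoupled from the class definition, which is the approach the PyTorch documentation recommends.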
Task 2: fill in the blank (medium)

Complete the code to load the saved state dictionary into the model.

PyTorch
model.load_state_dict(torch.load('[1]'))
A) state.pth
B) model.pth
C) weights.pth
D) checkpoint.pth
Common Mistakes
Using a filename different from the one passed to torch.save
Forgetting to load the state dictionary
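A minimal runnable sketch of the round trip Task 2 tests (assumes PyTorch; the two `nn.Linear` models and the temporary path are illustrative assumptions):

```python
import os
import tempfile

import torch
import torch.nn as nn

path = os.path.join(tempfile.mkdtemp(), "model.pth")

src = nn.Linear(4, 2)
torch.save(src.state_dict(), path)  # filename used to save...

dst = nn.Linear(4, 2)  # fresh model with different random weights
# ...must be the same filename used to load
dst.load_state_dict(torch.load(path))

print(torch.equal(src.weight, dst.weight))  # True
```

Note that `load_state_dict` copies parameters into an already-constructed model of the same architecture; it does not build the model for you.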
Task 3: fill in the blank (hard)

Fix the error in saving the optimizer state for checkpointing.

PyTorch
torch.save(optimizer[1], 'optimizer.pth')
A) .gradients
B) .parameters()
C) .state_dict()
D) .weights
Common Mistakes
Saving optimizer.parameters(), which does not exist
Saving optimizer.weights, which is invalid
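A minimal runnable sketch for Task 3 (assumes PyTorch; the Adam optimizer and tiny model are illustrative choices): optimizers expose `state_dict()` just like modules, and it captures both the hyperparameters and the per-parameter buffers (e.g. Adam's moment estimates) needed to resume training exactly.

```python
import os
import tempfile

import torch
import torch.nn as nn

model = nn.Linear(4, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

path = os.path.join(tempfile.mkdtemp(), "optimizer.pth")

# Optimizers have state_dict(), not parameters() or weights.
torch.save(optimizer.state_dict(), path)

state = torch.load(path)
print(sorted(state.keys()))  # ['param_groups', 'state']
```

Skipping the optimizer state is a common checkpointing bug: the model weights resume correctly, but adaptive optimizers restart with empty moment buffers.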
Task 4: fill in the blank (hard)

Fill in both blanks to save the model and optimizer states in one checkpoint dictionary.

PyTorch
checkpoint = {'model': model[1], 'optimizer': optimizer[2], 'epoch': epoch}
torch.save(checkpoint, 'checkpoint.pth')
A) .state_dict()
B) .parameters()
C) .gradients
D) .weights
Common Mistakes
Using different methods for model and optimizer
Using .parameters() or .weights instead of .state_dict()
Task 5: fill in the blank (hard)

Fill in all three blanks to load the checkpoint and resume training with the model, optimizer, and epoch.

PyTorch
checkpoint = torch.load('checkpoint.pth')
model.load_state_dict(checkpoint['[1]'])
optimizer.load_state_dict(checkpoint['[2]'])
epoch = checkpoint['[3]']
A) model
B) optimizer
C) epoch
D) state
Common Mistakes
Using wrong keys like 'state' instead of 'model'
Forgetting to load epoch to resume training correctly
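Putting Tasks 4 and 5 together, here is a minimal runnable sketch of the full save-and-resume cycle (assumes PyTorch; the tiny model, SGD optimizer, epoch value, and temporary path are illustrative assumptions):

```python
import os
import tempfile

import torch
import torch.nn as nn

model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
epoch = 7  # pretend we just finished epoch 7

path = os.path.join(tempfile.mkdtemp(), "checkpoint.pth")

# Task 4: bundle everything into one dictionary using consistent keys.
checkpoint = {
    "model": model.state_dict(),
    "optimizer": optimizer.state_dict(),
    "epoch": epoch,
}
torch.save(checkpoint, path)

# --- later, or in a new process ---
# Task 5: restore all three using the same keys that were saved.
ckpt = torch.load(path)
model.load_state_dict(ckpt["model"])
optimizer.load_state_dict(ckpt["optimizer"])
start_epoch = ckpt["epoch"] + 1  # resume from the next epoch

print(start_epoch)  # 8
```

The keys (`'model'`, `'optimizer'`, `'epoch'`) are your own naming convention, so they must match exactly between save and load; restoring the epoch is what lets the training loop pick up where it left off rather than restarting from epoch 0.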