Complete the code to save the model checkpoint including the optimizer state.
torch.save({'model_state_dict': model.state_dict(), 'optimizer_state_dict': [1]}, 'checkpoint.pth')
To save the optimizer state, use optimizer.state_dict() inside the checkpoint dictionary.
Complete the code to load the optimizer state from the checkpoint.
checkpoint = torch.load('checkpoint.pth')
optimizer.[1](checkpoint['optimizer_state_dict'])
To restore the optimizer state, use optimizer.load_state_dict() with the saved state dictionary.
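The two steps above form a round trip: save the optimizer's state_dict inside a checkpoint dictionary, then restore it with load_state_dict(). A minimal sketch, using a hypothetical tiny Linear model and SGD optimizer purely for illustration:

```python
import os
import tempfile

import torch
import torch.nn as nn

# Hypothetical tiny model and optimizer for illustration
model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Save the optimizer state (alongside the model state) in one checkpoint dict
path = os.path.join(tempfile.mkdtemp(), 'checkpoint.pth')
torch.save({'model_state_dict': model.state_dict(),
            'optimizer_state_dict': optimizer.state_dict()}, path)

# Restore into fresh instances, as you would after a restart
model2 = nn.Linear(4, 2)
optimizer2 = torch.optim.SGD(model2.parameters(), lr=0.1)
checkpoint = torch.load(path)
model2.load_state_dict(checkpoint['model_state_dict'])
optimizer2.load_state_dict(checkpoint['optimizer_state_dict'])
```

Note that load_state_dict() requires the model and optimizer objects to be constructed first; the checkpoint only supplies their internal state.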
Fix the error in the code to correctly save the checkpoint with model and optimizer states.
torch.save({'model': model.state_dict(), 'optimizer': [1]}, 'checkpoint.pth')
The checkpoint dictionary keys can be any string, but the value for the optimizer must be optimizer.state_dict() for its state to be saved.
Fill both blanks to correctly load the model and optimizer states from the checkpoint.
checkpoint = torch.load('checkpoint.pth')
model.[1](checkpoint['model'])
optimizer.[2](checkpoint['optimizer'])
Both model and optimizer use load_state_dict() to restore their saved states from the checkpoint.
Fill all three blanks to save a checkpoint with model state, optimizer state, and current epoch.
torch.save({'model_state_dict': model.[1](), 'optimizer_state_dict': optimizer.[2](), 'epoch': [3]}, 'checkpoint.pth')
Use the state_dict() method to get the model and optimizer states. The epoch number is saved from a variable, here named epoch.
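Putting all three pieces together, a typical pattern is to checkpoint at the end of an epoch and, when resuming, continue from the following epoch. A sketch, with a hypothetical model and a made-up epoch value:

```python
import os
import tempfile

import torch
import torch.nn as nn

# Hypothetical model and optimizer for illustration
model = nn.Linear(3, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
epoch = 5  # the last completed epoch (example value)

# Save model state, optimizer state, and the current epoch in one checkpoint
path = os.path.join(tempfile.mkdtemp(), 'checkpoint.pth')
torch.save({'model_state_dict': model.state_dict(),
            'optimizer_state_dict': optimizer.state_dict(),
            'epoch': epoch}, path)

# Resume: rebuild the objects, restore their states, and pick up the counter
checkpoint = torch.load(path)
model.load_state_dict(checkpoint['model_state_dict'])
optimizer.load_state_dict(checkpoint['optimizer_state_dict'])
start_epoch = checkpoint['epoch'] + 1  # continue training from the next epoch
```

Saving the epoch alongside the states is what lets a training loop resume exactly where it left off rather than restarting from epoch 0.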