Complete the code to save a PyTorch model's state dictionary.
torch.save([1], 'model.pth')
We save the model's state dictionary using model.state_dict() to capture the learned parameters.
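Filled in, the call saves `model.state_dict()` as the first argument. A minimal sketch using a hypothetical tiny model (the `nn.Linear(4, 2)` architecture is illustrative only):

```python
import torch
import torch.nn as nn

# Hypothetical tiny model used only for illustration
model = nn.Linear(4, 2)

# Save only the learned parameters (the state dict), not the whole module object
torch.save(model.state_dict(), 'model.pth')

# To restore: rebuild the same architecture, then load the weights into it
restored = nn.Linear(4, 2)
restored.load_state_dict(torch.load('model.pth'))
```

Saving the state dict rather than the full module keeps the checkpoint decoupled from the class definition's import path.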
Complete the command to create a .mar file using torch-model-archiver.
torch-model-archiver --model-name my_model --version 1.0 --serialized-file [1] --handler handler.py --export-path model_store
The serialized file is the saved model weights file, typically named model.pth.
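With the blank filled in, the completed command looks like this (the file names match the illustrative ones used above):

```shell
torch-model-archiver --model-name my_model \
    --version 1.0 \
    --serialized-file model.pth \
    --handler handler.py \
    --export-path model_store
```

This writes `my_model.mar` into the `model_store` directory.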
Fix the error in loading a model from a .mar file using TorchServe.
torchserve --start --model-store model_store --models my_model=[1]
When starting TorchServe, the model must be referenced by its .mar file name.
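Filled in, the start command references the archive file by name (assuming `my_model.mar` was produced by torch-model-archiver into `model_store`):

```shell
torchserve --start \
    --model-store model_store \
    --models my_model=my_model.mar
```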
Fill both blanks to define a custom handler class for TorchServe.
from ts.torch_handler.base_handler import {{BLANK_1}}

class CustomHandler({{BLANK_2}}):
    def preprocess(self, data):
        pass
The custom handler should inherit from BaseHandler, which lets it override TorchServe's request-handling methods such as preprocess, inference, and postprocess.
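With both blanks filled (`BaseHandler` in each), a minimal sketch looks like the following; it assumes TorchServe is installed, and the preprocess body is left as a stub:

```python
from ts.torch_handler.base_handler import BaseHandler

class CustomHandler(BaseHandler):
    # Override preprocess to turn raw request data into model input
    def preprocess(self, data):
        # Stub: a real handler would decode and tensorize `data` here
        return data
```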
Fill all three blanks to create a dictionary for model metadata in the .mar packaging process.
model_info = {
'model_name': [1],
'model_version': [2],
'handler': [3]
}
The dictionary keys hold the model name, version, and handler script file name as strings.
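A filled-in version of the dictionary above; the specific values ('my_model', '1.0', 'handler.py') are the illustrative names used throughout and are assumptions, not required values:

```python
# Metadata describing the model being packaged; all values are strings
model_info = {
    'model_name': 'my_model',       # matches --model-name
    'model_version': '1.0',         # matches --version
    'handler': 'handler.py',        # matches --handler
}
```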