
Model packaging (.mar files) in PyTorch

Introduction

Model packaging into .mar files helps you bundle your PyTorch model and its code so it can be easily shared and deployed.

You want to deploy a PyTorch model as a web service.
You need to share your trained model with others in a ready-to-use format.
You want to use TorchServe to serve your model for predictions.
You want to keep your model and its code together for easy version control.
Syntax
PyTorch
torch-model-archiver --model-name <model_name> --version <version> --serialized-file <model_file> --handler <handler_file> --extra-files <extra_files> --export-path <export_path>

Replace placeholders like <model_name> with your actual model name.

The --handler is either a path to a custom Python handler script or the name of a built-in handler (for example, image_classifier) that defines how to load your model and run inference.

Examples
This packages a ResNet18 model saved in resnet18.pt using the built-in image_classifier handler and saves the .mar file in model_store.
PyTorch
torch-model-archiver --model-name resnet18 --version 1.0 --serialized-file resnet18.pt --handler image_classifier --export-path model_store
This packages a custom model with a custom handler and an extra JSON file needed for predictions.
PyTorch
torch-model-archiver --model-name mymodel --version 2.0 --serialized-file model.pt --handler my_handler.py --extra-files index_to_name.json --export-path model_store
Sample Model

This example saves a pretrained ResNet18 model and shows the command to package it into a .mar file for TorchServe deployment.

PyTorch
import torch
from torchvision import models

# Step 1: Save a pretrained model
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

# Save the model as TorchScript for TorchServe
example_input = torch.randn(1, 3, 224, 224)
traced_model = torch.jit.trace(model, example_input)
traced_model.save('resnet18.pt')

# Step 2: Use torch-model-archiver command (run in terminal, not Python)
# torch-model-archiver --model-name resnet18 --version 1.0 --serialized-file resnet18.pt --handler image_classifier --export-path model_store

# Step 3: After packaging, you get resnet18.mar in model_store folder

print('Model saved as resnet18.pt')
print('Run torch-model-archiver command in terminal to create .mar file')
Output
Model saved as resnet18.pt
Run torch-model-archiver command in terminal to create .mar file
Important Notes

The torch-model-archiver tool is a command-line utility, so you run it outside Python in your terminal.

Install TorchServe and the archiver before packaging: pip install torchserve torch-model-archiver.

The .mar file contains your model and code, making deployment easier and consistent.
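Under the hood, a .mar file is a zip archive containing a MAR-INF/MANIFEST.json manifest alongside your model and handler files. The sketch below builds a toy archive with that layout to show how you can inspect one using Python's standard zipfile module:

```python
# A .mar file is a zip archive; this toy build mirrors its layout
# (MAR-INF/MANIFEST.json plus the model artifacts).
import io
import json
import zipfile

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as mar:
    mar.writestr("MAR-INF/MANIFEST.json", json.dumps({
        "model": {"modelName": "resnet18", "modelVersion": "1.0"}
    }))
    mar.writestr("resnet18.pt", b"<serialized weights>")

with zipfile.ZipFile(buf) as mar:
    print(mar.namelist())
```

You can list the contents of a real archive the same way, e.g. zipfile.ZipFile('model_store/resnet18.mar').namelist().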

Summary

Packaging models into .mar files bundles model and code for easy deployment.

Use torch-model-archiver command with model file and handler script.

.mar files are used by TorchServe to serve models as APIs.
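Once the .mar file is in your model store, serving it is a short command-line workflow. A sketch (assumes TorchServe is installed and kitten.jpg is a sample image you supply):

```shell
# Start TorchServe and register the packaged model
torchserve --start --model-store model_store --models resnet18=resnet18.mar

# Send an image to the inference endpoint (default port 8080)
curl http://127.0.0.1:8080/predictions/resnet18 -T kitten.jpg

# Stop the server when done
torchserve --stop
```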