
TorchScript export in PyTorch - ML Experiment: Train & Evaluate

Experiment - TorchScript export
Problem: You have a PyTorch model trained for image classification. You want to export it with TorchScript so it can run efficiently in a production environment without a Python dependency.
Current Metrics: Training accuracy: 92%, validation accuracy: 89%. The model is trained and working in PyTorch but not yet exported.
Issue: The model has not been exported. You need to convert it to TorchScript format correctly so it can be used outside Python.
Your Task
Export the trained PyTorch model to TorchScript format using tracing or scripting, then load and run inference with the exported model to verify it produces the same predictions.
Do not retrain the model.
Use either torch.jit.trace or torch.jit.script for export.
Verify the exported model outputs match the original model outputs on sample input.
Solution
PyTorch
import torch
import torch.nn as nn

# Define a simple model for demonstration
class SimpleCNN(nn.Module):
    def __init__(self):
        super(SimpleCNN, self).__init__()
        self.conv = nn.Conv2d(3, 16, 3, stride=1, padding=1)
        self.relu = nn.ReLU()
        self.pool = nn.AdaptiveAvgPool2d((1,1))
        self.fc = nn.Linear(16, 10)
    def forward(self, x):
        x = self.conv(x)
        x = self.relu(x)
        x = self.pool(x)
        x = torch.flatten(x, 1)
        x = self.fc(x)
        return x

# Assume model is trained
model = SimpleCNN()
model.eval()

# Sample input for tracing
example_input = torch.randn(1, 3, 32, 32)

# Export model using tracing
traced_model = torch.jit.trace(model, example_input)

# Save the traced model
traced_model.save("traced_model.pt")

# Load the traced model
loaded_model = torch.jit.load("traced_model.pt")
loaded_model.eval()

# Verify outputs match
with torch.no_grad():
    original_output = model(example_input)
    traced_output = loaded_model(example_input)

# Check if outputs are close
outputs_match = torch.allclose(original_output, traced_output, atol=1e-6)

print(f"Outputs match: {outputs_match}")
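Beyond comparing outputs, you can inspect what TorchScript actually recorded: any ScriptModule (traced or scripted) exposes the generated TorchScript source via `.code` and its IR via `.graph`. A minimal sketch using a tiny stand-in layer rather than the full SimpleCNN:

```python
import torch
import torch.nn as nn

# Tiny stand-in module for illustration (not the tutorial's SimpleCNN)
layer = nn.Linear(4, 2)
traced = torch.jit.trace(layer, torch.randn(1, 4))

# .code shows the generated TorchScript source; .graph shows the IR
print(traced.code)
print(traced.graph)
```

This is a useful sanity check that the exported module captured the operations you expect.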
Note: `super(SimpleCNN, self).__init__()` is the explicit form kept for older-Python compatibility; in Python 3, plain `super().__init__()` is equivalent.
Results Interpretation

Before export: Model runs only in PyTorch with Python dependency.

After export: Model runs as a TorchScript module; its outputs match the original model within numerical tolerance (verified with torch.allclose).

TorchScript export allows running PyTorch models efficiently outside Python while preserving model behavior.
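One caveat when choosing between the two export paths: tracing records only the operations executed for the example input, so data-dependent control flow gets baked in, while scripting compiles the Python source and preserves branches. A minimal sketch (the `Clamp` module name is illustrative):

```python
import torch
import torch.nn as nn

class Clamp(nn.Module):
    # Data-dependent branch: tracing records only the path taken
    # for the example input; scripting preserves the `if`.
    def forward(self, x):
        if x.sum() > 0:
            return x
        return -x

model = Clamp()
pos = torch.ones(3)
neg = -torch.ones(3)

traced = torch.jit.trace(model, pos)  # records only the "positive" branch
scripted = torch.jit.script(model)    # compiles both branches

print(traced(neg))    # branch was frozen at trace time, so the `if` is ignored
print(scripted(neg))  # the `if` still runs, so the negative branch is taken
```

If your model's forward pass has loops or conditionals that depend on tensor values, prefer `torch.jit.script`; for straight-line models like the SimpleCNN above, tracing is usually sufficient.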
Bonus Experiment
Try exporting the same model using torch.jit.script instead of tracing and compare the results.
💡 Hint
Use torch.jit.script(model) and verify outputs similarly.
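One possible shape for the bonus, assuming the same SimpleCNN as in the solution (redefined here so the snippet stands alone):

```python
import torch
import torch.nn as nn

class SimpleCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, 3, stride=1, padding=1)
        self.relu = nn.ReLU()
        self.pool = nn.AdaptiveAvgPool2d((1, 1))
        self.fc = nn.Linear(16, 10)

    def forward(self, x):
        x = self.pool(self.relu(self.conv(x)))
        return self.fc(torch.flatten(x, 1))

model = SimpleCNN()
model.eval()

# Scripting compiles the Python source directly; no example input needed
scripted_model = torch.jit.script(model)
scripted_model.save("scripted_model.pt")

loaded = torch.jit.load("scripted_model.pt")
loaded.eval()

x = torch.randn(1, 3, 32, 32)
with torch.no_grad():
    assert torch.allclose(model(x), loaded(x), atol=1e-6)
print("Scripted outputs match original")
```

For this model both export paths should agree, since the forward pass has no data-dependent control flow.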