Challenge - 5 Problems
TorchScript Export Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
Difficulty: intermediate
What is the output of this TorchScript export code?
Consider this PyTorch model and its TorchScript export. What will be printed after running the code?
PyTorch
import torch
import torch.nn as nn

class SimpleModel(nn.Module):
    def forward(self, x):
        return x * 2

model = SimpleModel()
scripted_model = torch.jit.script(model)
print(scripted_model(torch.tensor(3)))
💡 Hint
Think about what the model does to the input tensor.
✅ Explanation
The model multiplies its input by 2. The input is 3, so scripted_model(torch.tensor(3)) prints tensor(6).
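To check the predicted output end to end, here is a minimal, runnable sketch of the same export. The save/load round trip through an in-memory buffer is an added illustration of what "export" buys you, not part of the original question:

```python
import io
import torch
import torch.nn as nn

class SimpleModel(nn.Module):
    def forward(self, x):
        return x * 2  # doubles every element of the input tensor

scripted = torch.jit.script(SimpleModel())
print(scripted(torch.tensor(3)))  # tensor(6)

# A scripted model can be serialized and reloaded without the Python class;
# an in-memory buffer stands in for a file here.
buffer = io.BytesIO()
torch.jit.save(scripted, buffer)
buffer.seek(0)
reloaded = torch.jit.load(buffer)
print(reloaded(torch.tensor(3)))  # tensor(6)
```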
❓ Model Choice
Difficulty: intermediate
Which model can be successfully exported with torch.jit.script?
Given these PyTorch models, which one will torch.jit.script successfully export without errors?
PyTorch
import torch
import torch.nn as nn

class ModelA(nn.Module):
    def forward(self, x):
        if x.sum() > 0:
            return x * 2
        else:
            return x - 2

class ModelB(nn.Module):
    def forward(self, x):
        return x.numpy() + 1

class ModelC(nn.Module):
    def forward(self, x):
        return x + 1

class ModelD(nn.Module):
    def forward(self, x):
        return x.tolist()
💡 Hint
TorchScript does not support converting tensors to numpy or list inside scripted models.
✅ Explanation
ModelA uses only PyTorch operations and control flow that TorchScript supports, so it scripts successfully. ModelB (.numpy()) and ModelD (.tolist()) use tensor conversions that the TorchScript compiler rejects. ModelC would also script without errors, but ModelA is the intended answer because it demonstrates that data-dependent control flow is supported by torch.jit.script.
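A short sketch illustrating why ModelA scripts cleanly: torch.jit.script compiles both branches of the data-dependent if, so either path can be taken at run time. The example inputs below are illustrative:

```python
import torch
import torch.nn as nn

class ModelA(nn.Module):
    def forward(self, x):
        # Data-dependent branch: scripting compiles BOTH paths.
        if x.sum() > 0:
            return x * 2
        return x - 2

scripted = torch.jit.script(ModelA())
print(scripted(torch.tensor([1.0, 2.0])))    # positive sum  -> tensor([2., 4.])
print(scripted(torch.tensor([-1.0, -2.0])))  # non-positive  -> tensor([-3., -4.])
```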
❓ Hyperparameter
Difficulty: advanced
Which option correctly sets optimization for TorchScript export?
You want to export a PyTorch model with torch.jit.script and enable optimization passes. Which code snippet correctly does this?
💡 Hint
Optimization is applied after scripting using a separate function.
✅ Explanation
torch.jit.script does not accept a working optimize flag (the legacy parameter is deprecated and ignored). Optimization is a separate step: pass the scripted model to torch.jit.optimize_for_inference, which returns an optimized module.
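A minimal sketch of the script-then-optimize pattern described above. The Net module and its layer sizes are hypothetical stand-ins; note that optimize_for_inference expects a model in eval mode:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 2)

    def forward(self, x):
        return self.linear(x)

# Script first, then optimize: optimization is a separate pass,
# not a keyword argument to torch.jit.script.
scripted = torch.jit.script(Net().eval())
optimized = torch.jit.optimize_for_inference(scripted)
print(optimized(torch.randn(1, 4)).shape)  # torch.Size([1, 2])
```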
🔧 Debug
Difficulty: advanced
What error does this TorchScript export code raise?
What error will this code raise when trying to export the model with torch.jit.script?
PyTorch
import torch
import torch.nn as nn

class BuggyModel(nn.Module):
    def forward(self, x):
        y = x.numpy() + 1
        return torch.tensor(y)

model = BuggyModel()
scripted = torch.jit.script(model)
💡 Hint
TorchScript does not allow tensor.numpy() calls inside scripted models.
✅ Explanation
Calling .numpy() on a tensor is not supported inside a scripted model, so torch.jit.script raises a runtime error during compilation, before the model is ever called.
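A hedged sketch of triggering and catching the scripting failure. The exact error type and message vary across PyTorch versions, so the code only checks that compilation fails:

```python
import torch
import torch.nn as nn

class BuggyModel(nn.Module):
    def forward(self, x):
        y = x.numpy() + 1  # .numpy() has no TorchScript equivalent
        return torch.tensor(y)

# Scripting fails at compile time, before the model is ever called.
try:
    torch.jit.script(BuggyModel())
    scripting_failed = False
except Exception as err:  # typically raised by the TorchScript compiler
    scripting_failed = True
    print("scripting failed:", type(err).__name__)

# The eager model still runs, since .numpy() is fine outside TorchScript:
print(BuggyModel()(torch.tensor([1.0])))  # tensor([2.])
```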
🧠 Conceptual
Difficulty: expert
Why use torch.jit.trace instead of torch.jit.script for exporting a model?
Which reason best explains when torch.jit.trace is preferred over torch.jit.script for exporting PyTorch models?
💡 Hint
Tracing records operations from example inputs, ignoring control flow.
✅ Explanation
torch.jit.trace records the operations executed on example inputs and does not capture data-dependent Python control flow. It is therefore preferred when the forward pass is a straight-line sequence of tensor operations, or uses Python features that torch.jit.script cannot compile; when branches depend on the input data, torch.jit.script is required.
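A sketch of the pitfall this answer describes: tracing a model with a data-dependent branch freezes whichever branch the example input takes. The Branchy module is a hypothetical stand-in, and PyTorch emits a TracerWarning when tracing it:

```python
import torch
import torch.nn as nn

class Branchy(nn.Module):
    def forward(self, x):
        if x.sum() > 0:
            return x * 2
        return x - 2

# Tracing records only the path taken for the example input.
example = torch.tensor([1.0, 1.0])            # positive sum -> the '* 2' branch
traced = torch.jit.trace(Branchy(), example)  # warns: branch is frozen into the graph
print(traced(torch.tensor([1.0, 1.0])))       # tensor([2., 2.])
print(traced(torch.tensor([-1.0, -1.0])))     # tensor([-2., -2.]) -- still '* 2', wrong branch!
```

Eager execution of the second call would return tensor([-3., -3.]), which is why trace should only be used when the forward pass has no data-dependent branching.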