PyTorch · ~20 mins

TorchScript export in PyTorch - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output (intermediate)
What is the output of this TorchScript export code?
Consider this PyTorch model and its TorchScript export. What will be printed after running the code?
PyTorch
import torch
import torch.nn as nn

class SimpleModel(nn.Module):
    def forward(self, x):
        return x * 2

model = SimpleModel()
scripted_model = torch.jit.script(model)
print(scripted_model(torch.tensor(3)))
A. tensor(9)
B. tensor(3)
C. tensor(6)
D. TypeError
💡 Hint
Think about what the model does to the input tensor.
Model Choice (intermediate)
Which model can be successfully exported with torch.jit.script?
Given these PyTorch models, which one will torch.jit.script successfully export without errors?
PyTorch
import torch
import torch.nn as nn

class ModelA(nn.Module):
    def forward(self, x):
        if x.sum() > 0:
            return x * 2
        else:
            return x - 2

class ModelB(nn.Module):
    def forward(self, x):
        return x.numpy() + 1

class ModelC(nn.Module):
    def forward(self, x):
        return x + 1

class ModelD(nn.Module):
    def forward(self, x):
        return x.tolist()
A. ModelA
B. ModelB
C. ModelC
D. ModelD
💡 Hint
TorchScript does not support converting tensors to numpy or list inside scripted models.
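As the hint notes, `torch.jit.script` compiles data-dependent Python control flow such as `if`/`else` just fine; what breaks scripting is calling Python-only methods like `.numpy()` or `.tolist()` inside `forward`. A minimal sketch (class name hypothetical) showing that a branching model scripts and runs correctly on both branches:

```python
import torch
import torch.nn as nn

class Branchy(nn.Module):
    def forward(self, x):
        # torch.jit.script compiles this data-dependent if/else into the graph
        if x.sum() > 0:
            return x * 2
        else:
            return x - 2

scripted = torch.jit.script(Branchy())
print(scripted(torch.tensor([1.0, 2.0])))    # positive branch: tensor([2., 4.])
print(scripted(torch.tensor([-1.0, -2.0])))  # negative branch: tensor([-3., -4.])
```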
Hyperparameter (advanced)
Which option correctly sets optimization for TorchScript export?
You want to export a PyTorch model with torch.jit.script and enable optimization passes. Which code snippet correctly does this?
A.
scripted_model = torch.jit.script(model)
scripted_model = torch.jit.optimize_for_inference(scripted_model)

B.
scripted_model = torch.jit.script(model)
torch.jit.optimize_for_inference(scripted_model)

C.
scripted_model = torch.jit.script(model, optimize=False)

D.
scripted_model = torch.jit.script(model, optimize=True)
💡 Hint
Optimization is applied after scripting using a separate function.
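The key detail is that `torch.jit.optimize_for_inference` returns a new optimized module rather than mutating its argument in place, so the return value must be captured. A minimal sketch with a hypothetical example model (the module should be in eval mode before optimizing):

```python
import torch
import torch.nn as nn

class Tiny(nn.Module):  # hypothetical model for illustration
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 2)

    def forward(self, x):
        return self.linear(x)

model = Tiny().eval()  # optimize_for_inference expects an eval-mode module
scripted = torch.jit.script(model)
# Returns a new optimized ScriptModule; assigning the result is what makes
# the optimization take effect
optimized = torch.jit.optimize_for_inference(scripted)
out = optimized(torch.randn(1, 4))
```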
🔧 Debug (advanced)
What error does this TorchScript export code raise?
What error will this code raise when trying to export the model with torch.jit.script?
PyTorch
import torch
import torch.nn as nn

class BuggyModel(nn.Module):
    def forward(self, x):
        y = x.numpy() + 1
        return torch.tensor(y)

model = BuggyModel()
scripted = torch.jit.script(model)
A. SyntaxError: invalid syntax
B. RuntimeError: Cannot convert tensor to numpy inside TorchScript
C. TypeError: 'Tensor' object is not callable
D. No error, exports successfully
💡 Hint
TorchScript does not allow tensor.numpy() calls inside scripted models.
🧠 Conceptual (expert)
Why use torch.jit.trace instead of torch.jit.script for exporting a model?
Which reason best explains when torch.jit.trace is preferred over torch.jit.script for exporting PyTorch models?
A. When the model is purely functional with no control flow
B. When you want to optimize the model for inference automatically
C. When you want to export a model without running it on example inputs
D. When the model contains complex Python control flow that torch.jit.script cannot handle
💡 Hint
Tracing records operations from example inputs, ignoring control flow.
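The hint can be demonstrated directly: `torch.jit.trace` runs the model once on the example input and records only the operations that were actually executed, so a data-dependent branch is baked into the traced graph. A minimal sketch (class name hypothetical; tracing a branching model like this emits a TracerWarning):

```python
import torch
import torch.nn as nn

class Branchy(nn.Module):
    def forward(self, x):
        if x.sum() > 0:
            return x * 2
        return x - 2

# Tracing with a positive example bakes the x * 2 branch into the graph;
# the if statement itself does not survive tracing
traced = torch.jit.trace(Branchy(), torch.tensor([1.0]))
print(traced(torch.tensor([-1.0])))  # still multiplies: tensor([-2.])
```

This is why tracing is the safer choice only for purely functional models without control flow, while scripting preserves the branch.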