Challenge - 5 Problems
TorchScript Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
intermediate · 2:00
Output of TorchScript tracing with control flow
Consider the following PyTorch model and its TorchScript tracing. What will be the output when running scripted_model(torch.tensor(3))?

PyTorch

import torch
import torch.nn as nn

class MyModel(nn.Module):
    def forward(self, x):
        if x.item() > 2:
            return x * 2
        else:
            return x + 2

model = MyModel()
scripted_model = torch.jit.trace(model, torch.tensor(1))
output = scripted_model(torch.tensor(3))
print(output.item())
💡 Hint
Tracing records operations for the example input. Think about what happens when input changes.
✗ Incorrect
Tracing records the operations executed with the example input (1). Since 1 <= 2, the else branch (x + 2) is recorded. When input 3 is passed, the traced model still executes x + 2, returning 5.
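The pitfall described above can be reproduced directly. A minimal sketch, assuming PyTorch is installed:

```python
import torch
import torch.nn as nn

class MyModel(nn.Module):
    def forward(self, x):
        if x.item() > 2:
            return x * 2
        else:
            return x + 2

model = MyModel()

# Tracing with example input 1 records only the else branch (x + 2);
# PyTorch emits a TracerWarning about the data-dependent branch.
traced = torch.jit.trace(model, torch.tensor(1))

# The recorded ops run regardless of the new input's value: 3 + 2 = 5, not 3 * 2 = 6.
print(traced(torch.tensor(3)).item())  # 5
```

The warning PyTorch prints during tracing is the tell: it flags that the Python `if` was evaluated once, at trace time, and will not be re-evaluated at run time.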
❓ Model Choice
intermediate · 1:30
Choosing between tracing and scripting for dynamic control flow
You have a PyTorch model with dynamic control flow depending on input values. Which TorchScript method should you use to correctly convert the model for production?
💡 Hint
Tracing records operations for fixed inputs, scripting analyzes code.
✗ Incorrect
Tracing records operations executed for example inputs and cannot capture dynamic control flow. Scripting analyzes the model code and supports dynamic control flow, making it suitable for such models.
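Scripting's advantage can be shown with a small module whose branch depends on the input value. A minimal sketch, assuming PyTorch is installed; the module name Branchy is illustrative:

```python
import torch
import torch.nn as nn

class Branchy(nn.Module):
    def forward(self, x):
        # Data-dependent control flow: which branch runs depends on x's value.
        if x.sum() > 2:
            return x * 2
        return x + 2

# torch.jit.script compiles the Python source, so both branches are preserved.
scripted = torch.jit.script(Branchy())

print(scripted(torch.tensor(3.0)).item())  # 6.0: the > 2 branch is taken
print(scripted(torch.tensor(1.0)).item())  # 3.0: the else path still exists
```

Had the same module been traced with `torch.tensor(1.0)`, only the `x + 2` path would have survived; scripting keeps the conditional in the compiled graph.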
❓ Hyperparameter
advanced · 1:30
Optimizing TorchScript model for inference speed
Which of the following options is the best practice to improve inference speed of a TorchScript model in production?
💡 Hint
Freezing removes unnecessary parts and optimizes the graph.
✗ Incorrect
torch.jit.freeze optimizes a scripted model by inlining parameters as constants, removing unused attributes, and enabling further graph optimizations, which improves inference speed. Increasing the batch size or adding dropout does not optimize the model itself, and tracing with several example inputs in the hope of capturing every branch is not a supported way to speed up inference.
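The freezing workflow the explanation describes is short. A minimal sketch, assuming PyTorch is installed; the Linear/ReLU model is a placeholder:

```python
import torch
import torch.nn as nn

# Script a small placeholder model.
model = torch.jit.script(nn.Sequential(nn.Linear(4, 4), nn.ReLU()))
model.eval()  # freeze requires the module to be in eval mode

# torch.jit.freeze inlines parameters and attributes as constants and
# prunes unused submodules, enabling further graph optimizations.
frozen = torch.jit.freeze(model)

x = torch.randn(1, 4)
# The outputs match; only the graph representation is optimized.
assert torch.allclose(model(x), frozen(x))
```

Calling `.eval()` first matters: freezing bakes in training-mode flags, so dropout and batch-norm behavior is fixed at freeze time.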
🔧 Debug
advanced · 2:00
Identifying error in TorchScript scripted model
Given this scripted model code, what error will it raise when running scripted_model(torch.tensor([1, 2, 3]))?

PyTorch

import torch
import torch.nn as nn

class Model(nn.Module):
    def forward(self, x):
        return x[0] + x[3]

model = Model()
scripted_model = torch.jit.script(model)
output = scripted_model(torch.tensor([1, 2, 3]))
💡 Hint
Check the tensor size and indexing carefully.
✗ Incorrect
The input tensor has size 3, so index 3 is out of bounds, raising an IndexError. TorchScript supports tensor indexing and the addition itself is valid, so no other error occurs first.
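The failure can be reproduced as below. A minimal sketch, assuming PyTorch is installed; note the exception type surfaced by the TorchScript runtime has varied across versions, so the sketch catches broadly rather than asserting a specific class:

```python
import torch
import torch.nn as nn

class Model(nn.Module):
    def forward(self, x):
        # Index 3 is out of bounds for a size-3 tensor.
        return x[0] + x[3]

scripted = torch.jit.script(Model())

err = None
try:
    scripted(torch.tensor([1, 2, 3]))
except Exception as e:  # typically an IndexError; older runtimes may wrap it
    err = e
print(type(err).__name__, err)
```

The error only fires when the model is called with a too-small tensor; scripting itself succeeds, because the index expression is syntactically valid.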
🧠 Conceptual
expert · 2:30
Understanding TorchScript serialization and deployment
Which statement about TorchScript models saved with
torch.jit.save and loaded with torch.jit.load is TRUE?
💡 Hint
Think about portability and self-containment of TorchScript files.
✗ Incorrect
TorchScript saves the model code and parameters together, allowing loading and running without Python source code. The model can be loaded on different devices, and optimizations are preserved.
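A round trip through save and load illustrates the self-containment. A minimal sketch, assuming PyTorch is installed; an in-memory buffer stands in for a file path, which torch.jit.save also accepts:

```python
import io
import torch
import torch.nn as nn

scripted = torch.jit.script(nn.Linear(2, 2))

# Serialize code and parameters together into one artifact.
buf = io.BytesIO()
torch.jit.save(scripted, buf)
buf.seek(0)

# Loading needs no access to the original Python class definition;
# map_location controls which device the model is restored to.
loaded = torch.jit.load(buf, map_location="cpu")

x = torch.randn(1, 2)
assert torch.allclose(scripted(x), loaded(x))
```

The same `.pt` artifact can also be loaded from C++ via `torch::jit::load`, which is what makes TorchScript suitable for Python-free deployment.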