PyTorch · ~10 mins

Sequential model shortcut in PyTorch - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1 - Fill in the blank (easy)

Complete the code to create a simple sequential model with one linear layer.

PyTorch
import torch.nn as nn
model = nn.Sequential(nn.Linear(10, [1]))
A. 5
B. 10
C. 15
D. 20
Common Mistakes
Using the input size instead of output size for the second argument.
Confusing input and output dimensions.
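For reference, here is a minimal sketch of the completed model, assuming the intended answer is the output size 5 (as the common mistakes suggest, the second argument of nn.Linear is out_features, not in_features):

```python
import torch
import torch.nn as nn

# nn.Linear(in_features, out_features): 10 inputs mapped to 5 outputs.
model = nn.Sequential(nn.Linear(10, 5))

x = torch.randn(2, 10)   # a batch of 2 samples with 10 features each
out = model(x)
print(out.shape)         # torch.Size([2, 5])
```

A quick forward pass like this is an easy way to confirm you got the dimensions the right way round: the last dimension of the output equals out_features.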
Task 2 - Fill in the blank (medium)

Complete the code to add a ReLU activation after the linear layer in the sequential model.

PyTorch
import torch.nn as nn
model = nn.Sequential(
    nn.Linear(10, 5),
    [1]()
)
A. Sigmoid
B. Tanh
C. Softmax
D. ReLU
Common Mistakes
Using Sigmoid or Softmax which are less common for hidden layers.
Forgetting to add parentheses after the activation class.
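A sketch of the completed model, assuming ReLU is the intended activation (the standard choice for hidden layers, as the common mistakes hint). Note the parentheses: nn.ReLU() instantiates the module.

```python
import torch
import torch.nn as nn

# Linear layer followed by a ReLU activation module.
model = nn.Sequential(
    nn.Linear(10, 5),
    nn.ReLU(),
)

out = model(torch.randn(3, 10))
print(out.shape)              # torch.Size([3, 5])
# ReLU clamps negative activations to zero, so no output is negative.
print(bool((out >= 0).all())) # True
```

Checking that every output is non-negative is a simple sanity test that the ReLU is actually in the pipeline.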
Task 3 - Fill in the blank (hard)

Fix the error in the code to correctly create a sequential model with two layers.

PyTorch
import torch.nn as nn
model = nn.Sequential(
    nn.Linear(10, 5),
    nn.ReLU(),
    [1](5, 2)
)
A. Conv2d
B. Dropout
C. Linear
D. BatchNorm1d
Common Mistakes
Using Conv2d which expects image data.
Using BatchNorm1d or Dropout incorrectly as a layer with input/output sizes.
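A sketch of the corrected model, assuming the missing layer is nn.Linear (the only option here whose constructor takes an (in_features, out_features) pair). The key constraint is that the second layer's input size, 5, must match the first layer's output size:

```python
import torch
import torch.nn as nn

# Two linear layers: the inner dimension (5) must agree between them.
model = nn.Sequential(
    nn.Linear(10, 5),
    nn.ReLU(),
    nn.Linear(5, 2),
)

out = model(torch.randn(4, 10))
print(out.shape)  # torch.Size([4, 2])
```

If the sizes did not line up, the forward pass would raise a shape-mismatch error, which is why running a dummy batch through the model is a useful check.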
Task 4 - Fill in the blank (hard)

Fill both blanks to create a sequential model with two linear layers and a ReLU activation in between.

PyTorch
import torch.nn as nn
model = nn.Sequential(
    nn.[1](10, 8),
    nn.[2](),
    nn.Linear(8, 3)
)
A. Linear
B. ReLU
C. Sigmoid
D. Dropout
Common Mistakes
Using Sigmoid instead of ReLU for activation.
Using Dropout instead of activation.
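A sketch of the completed model, assuming the blanks are Linear and ReLU as the task description states (two linear layers with a ReLU in between), with hidden size 8 linking the two:

```python
import torch
import torch.nn as nn

# Linear -> ReLU -> Linear: 10 features in, hidden size 8, 3 features out.
model = nn.Sequential(
    nn.Linear(10, 8),
    nn.ReLU(),
    nn.Linear(8, 3),
)

out = model(torch.randn(2, 10))
print(out.shape)  # torch.Size([2, 3])
```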
Task 5 - Fill in the blank (hard)

Fill all three blanks to build a sequential model with two linear layers, a ReLU activation, and a dropout layer.

PyTorch
import torch.nn as nn
model = nn.Sequential(
    nn.[1](12, 10),
    nn.[2](),
    nn.[3](p=0.3),
    nn.Linear(10, 4)
)
A. Linear
B. ReLU
C. Dropout
D. BatchNorm1d
Common Mistakes
Using BatchNorm1d instead of Dropout for regularization.
Forgetting to add parentheses after the class names.
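A sketch of the completed model, assuming the blanks are Linear, ReLU, and Dropout as the task description states. nn.Dropout(p=0.3) randomly zeroes 30% of activations during training; switching to eval mode turns it off, which makes the check below deterministic:

```python
import torch
import torch.nn as nn

# Linear -> ReLU -> Dropout -> Linear, with hidden size 10.
model = nn.Sequential(
    nn.Linear(12, 10),
    nn.ReLU(),
    nn.Dropout(p=0.3),
    nn.Linear(10, 4),
)

model.eval()  # dropout is a no-op in eval mode
out = model(torch.randn(5, 12))
print(out.shape)  # torch.Size([5, 4])
```

Dropout differs from BatchNorm1d in role: Dropout is a regularizer that discards activations at random, while BatchNorm1d normalizes them, so the two are not interchangeable here.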