Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)
Complete the code to create a simple sequential model with one linear layer.

PyTorch:
    import torch.nn as nn
    model = nn.Sequential(nn.Linear(10, [1]))

Common mistakes:
- Using the input size instead of the output size for the second argument.
- Confusing the input and output dimensions.

Explanation: The linear layer's output size is 5, which matches the intended model design.
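For reference, a completed version of this snippet (with the blank filled as 5, per the explanation above) can be sanity-checked by passing a random batch through it:

```python
import torch
import torch.nn as nn

# Blank filled with 5: nn.Linear(in_features=10, out_features=5).
model = nn.Sequential(nn.Linear(10, 5))

x = torch.randn(4, 10)   # batch of 4 samples, 10 features each
out = model(x)
print(out.shape)         # torch.Size([4, 5])
```

The second argument to nn.Linear is always the output size; the batch dimension is carried through unchanged.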
Task 2: Fill in the blank (medium)
Complete the code to add a ReLU activation after the linear layer in the sequential model.

PyTorch:
    import torch.nn as nn
    model = nn.Sequential(
        nn.Linear(10, 5),
        [1]()
    )

Common mistakes:
- Using Sigmoid or Softmax, which are less common for hidden layers.
- Forgetting to add parentheses after the activation class.

Explanation: ReLU is a common activation function used after linear layers to add non-linearity.
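A completed version of the snippet (with the blank filled as nn.ReLU, per the explanation) also illustrates what ReLU does: every negative value in the output is clamped to zero.

```python
import torch
import torch.nn as nn

# Blank filled with nn.ReLU; note the () - the class must be instantiated.
model = nn.Sequential(
    nn.Linear(10, 5),
    nn.ReLU(),
)

x = torch.randn(2, 10)
out = model(x)
print(out.shape)                 # torch.Size([2, 5])
print((out >= 0).all().item())   # True: ReLU zeroes out negatives
```

The activation has no parameters, so the output shape is unchanged; only the values are transformed.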
Task 3: Fill in the blank (hard)
Fix the error in the code to correctly create a sequential model with two layers.

PyTorch:
    import torch.nn as nn
    model = nn.Sequential(
        nn.Linear(10, 5),
        nn.ReLU(),
        [1](5, 2)
    )

Common mistakes:
- Using Conv2d, which expects image data.
- Using BatchNorm1d or Dropout incorrectly as a layer with input/output sizes.

Explanation: The second layer should be nn.Linear, connecting 5 inputs to 2 outputs.
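Filling the blank with nn.Linear (as the explanation indicates) gives a small two-layer model; the key constraint is that the second layer's input size must equal the first layer's output size:

```python
import torch
import torch.nn as nn

# Blank filled with nn.Linear: the second linear layer takes the 5 features
# produced by the first layer and maps them to 2 outputs.
model = nn.Sequential(
    nn.Linear(10, 5),
    nn.ReLU(),
    nn.Linear(5, 2),
)

out = model(torch.randn(3, 10))
print(out.shape)  # torch.Size([3, 2])
```

If the sizes did not chain (e.g. nn.Linear(4, 2) in the last slot), the forward pass would raise a shape-mismatch error at runtime.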
Task 4: Fill in the blank (hard)
Fill both blanks to create a sequential model with two linear layers and a ReLU activation in between.

PyTorch:
    import torch.nn as nn
    model = nn.Sequential(
        nn.[1](10, 8),
        nn.[2](),
        nn.Linear(8, 3)
    )

Common mistakes:
- Using Sigmoid instead of ReLU for the activation.
- Using Dropout instead of an activation.

Explanation: The first blank is Linear (the first layer); the second blank is ReLU (the activation).
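With both blanks filled as the explanation describes, the snippet becomes the standard Linear → ReLU → Linear pattern:

```python
import torch
import torch.nn as nn

# Blanks filled: [1] = Linear, [2] = ReLU.
model = nn.Sequential(
    nn.Linear(10, 8),
    nn.ReLU(),
    nn.Linear(8, 3),
)

out = model(torch.randn(5, 10))
print(out.shape)  # torch.Size([5, 3])
```

Placing the activation between the two linear layers is what makes the model non-linear; two stacked Linear layers with nothing in between collapse to a single linear map.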
Task 5: Fill in the blank (hard)
Fill all three blanks to build a sequential model with two linear layers, a ReLU activation, and a dropout layer.

PyTorch:
    import torch.nn as nn
    model = nn.Sequential(
        nn.[1](12, 10),
        nn.[2](),
        nn.[3](p=0.3),
        nn.Linear(10, 4)
    )

Common mistakes:
- Using BatchNorm1d instead of Dropout for regularization.
- Forgetting to add parentheses after the class names.

Explanation: The model has a Linear layer, followed by a ReLU activation, then Dropout with probability 0.3, before the final Linear layer.
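The fully completed model (blanks filled as Linear, ReLU, and Dropout, per the explanation) looks like this; note that Dropout is only active in training mode, so eval() is called here to get a deterministic shape check:

```python
import torch
import torch.nn as nn

# Blanks filled: [1] = Linear, [2] = ReLU, [3] = Dropout.
model = nn.Sequential(
    nn.Linear(12, 10),
    nn.ReLU(),
    nn.Dropout(p=0.3),  # randomly zeroes 30% of activations during training
    nn.Linear(10, 4),
)

model.eval()  # disable dropout for a deterministic forward pass
out = model(torch.randn(6, 12))
print(out.shape)  # torch.Size([6, 4])
```

During training (model.train()), the Dropout layer zeroes each activation with probability 0.3 and rescales the rest, which regularizes the network; at inference it is a no-op.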