Challenge - 5 Problems
Layer Initialization Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
intermediate
Output of a simple PyTorch layer initialization
What is the output of the following code snippet that initializes a linear layer and prints its weight shape?
PyTorch
import torch
import torch.nn as nn

layer = nn.Linear(10, 5)
print(layer.weight.shape)
💡 Hint
Remember that nn.Linear(in_features, out_features) creates weights of shape (out_features, in_features).
✗ Incorrect
The weight matrix in nn.Linear has shape (out_features, in_features). Here, in_features=10 and out_features=5, so the printed shape is torch.Size([5, 10]).
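A quick check of the shape convention (assuming PyTorch is installed):

```python
import torch
import torch.nn as nn

# nn.Linear stores its weight as (out_features, in_features)
layer = nn.Linear(10, 5)
print(layer.weight.shape)  # torch.Size([5, 10])

# The transposed storage lets y = x @ W.T + b map 10 features to 5
x = torch.randn(2, 10)
print(layer(x).shape)      # torch.Size([2, 5])
```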
❓ Model Choice
intermediate
Choosing the correct __init__ signature for a convolutional layer
Which of the following __init__ method signatures correctly initializes a 2D convolutional layer with 3 input channels, 16 output channels, and a kernel size of 3?
💡 Hint
Conv2d expects (in_channels, out_channels, kernel_size).
✗ Incorrect
nn.Conv2d takes in_channels first, then out_channels, then kernel_size, so the correct call is nn.Conv2d(3, 16, 3). Option D matches this order with the correct values.
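A short sketch of the correct signature in use; the argument order follows nn.Conv2d(in_channels, out_channels, kernel_size):

```python
import torch
import torch.nn as nn

# 3 input channels (e.g. RGB), 16 output channels, 3x3 kernel
conv = nn.Conv2d(3, 16, 3)
print(conv.weight.shape)  # torch.Size([16, 3, 3, 3])

# A batch of images is (N, C, H, W); with no padding, 32 -> 30
x = torch.randn(1, 3, 32, 32)
print(conv(x).shape)      # torch.Size([1, 16, 30, 30])
```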
❓ Hyperparameter
advanced
Effect of bias parameter in layer initialization
In PyTorch, when initializing a linear layer with nn.Linear(10, 5, bias=False), what is the effect of setting bias=False?
💡 Hint
Bias controls whether an extra constant term is added to the output.
✗ Incorrect
Setting bias=False disables the additive bias term (layer.bias is None), so the layer output is just the weighted sum x @ W.T with no added constant.
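This can be verified directly (a minimal sketch, assuming PyTorch is installed):

```python
import torch
import torch.nn as nn

layer = nn.Linear(10, 5, bias=False)
print(layer.bias)  # None: no bias parameter is created at all

# The output is a pure weighted sum, x @ W.T, with nothing added
x = torch.randn(3, 10)
assert torch.allclose(layer(x), x @ layer.weight.T)
```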
🔧 Debug
advanced
Identify the error in this layer __init__ method
What error will occur when running this code snippet?
PyTorch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        self.linear = nn.Linear(10, 5)
        super().__init__()
💡 Hint
In PyTorch, always call super().__init__() before defining layers.
✗ Incorrect
super().__init__() must be called before any layers are assigned, because nn.Module.__init__ creates the internal registries (such as _modules and _parameters) that attribute assignment relies on. Assigning a layer first raises AttributeError: cannot assign module before Module.__init__() call.
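A minimal reproduction and fix (assuming PyTorch is installed): assigning a submodule before the base class is initialized raises AttributeError, while the corrected order registers the layer's parameters normally.

```python
import torch.nn as nn

class Broken(nn.Module):
    def __init__(self):
        # Bug: layer assigned before nn.Module.__init__ has run
        self.linear = nn.Linear(10, 5)
        super().__init__()

class Fixed(nn.Module):
    def __init__(self):
        super().__init__()  # sets up _modules/_parameters first
        self.linear = nn.Linear(10, 5)

try:
    Broken()
except AttributeError as e:
    print("AttributeError:", e)

print(len(list(Fixed().parameters())))  # 2: weight and bias
```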
🧠 Conceptual
expert
Understanding parameter registration in __init__
Why is it important to define layers as attributes inside the __init__ method of a PyTorch nn.Module subclass?
💡 Hint
Think about how PyTorch tracks parameters for optimization.
✗ Incorrect
PyTorch registers a layer's parameters when the layer is assigned as an attribute of an nn.Module, which conventionally happens in __init__. This registration is what lets model.parameters(), and therefore the optimizer, find and update them.
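A sketch illustrating registration: a layer assigned as a module attribute is tracked, while one hidden inside a plain Python list is invisible to parameters() (nn.ModuleList exists for that case).

```python
import torch.nn as nn

class Registered(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)        # registered via nn.Module.__setattr__

class Hidden(nn.Module):
    def __init__(self):
        super().__init__()
        self.layers = [nn.Linear(4, 2)]  # plain list: NOT registered

print(len(list(Registered().parameters())))  # 2 (weight + bias)
print(len(list(Hidden().parameters())))      # 0: the optimizer would see nothing
```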