PyTorch · ~10 mins

CNN architecture for image classification in PyTorch - Interactive Code Practice

Practice: 5 Tasks
Answer the questions below.
Task 1: Fill in the blank (easy)

Complete the code to define a convolutional layer in PyTorch.

PyTorch
conv1 = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=[1], stride=1, padding=1)
A. 5
B. 7
C. 3
D. 1
Common Mistakes
A kernel size larger than needed increases computation without a matching gain in accuracy.
A kernel size of 1 gives each output a receptive field of a single pixel, so no spatial context is captured.
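To see why a 3x3 kernel with stride 1 and padding 1 is the common choice, the standard convolution output-size formula can be sketched in plain Python. This is an illustrative helper, not a PyTorch function:

```python
def conv_out_size(in_size, kernel, stride=1, padding=0):
    # Standard convolution output-size formula:
    # out = floor((in + 2*padding - kernel) / stride) + 1
    return (in_size + 2 * padding - kernel) // stride + 1

# kernel_size=3 with stride=1 and padding=1 preserves the spatial size
print(conv_out_size(32, kernel=3, stride=1, padding=1))  # 32
# a larger kernel with the same padding shrinks the feature map
print(conv_out_size(32, kernel=5, stride=1, padding=1))  # 30
```

With kernel 3, stride 1, padding 1, a 32x32 input stays 32x32, which is why this combination appears so often in CNN definitions.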
Task 2: Fill in the blank (medium)

Complete the code to add a max pooling layer after the convolution.

PyTorch
pool = nn.MaxPool2d(kernel_size=[1], stride=2)
A. 4
B. 3
C. 1
D. 2
Common Mistakes
A kernel size of 1 pools over a single element, so it does no real pooling.
A stride smaller than the kernel size produces overlapping pooling windows.
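The same output-size arithmetic shows why kernel 2 with stride 2 is the usual pooling setup. A small illustrative helper (not a PyTorch function):

```python
def pool_out_size(in_size, kernel, stride):
    # Pooling uses the same output-size formula as convolution
    # (no padding assumed here)
    return (in_size - kernel) // stride + 1

# kernel_size=2, stride=2: non-overlapping windows, halves each dimension
print(pool_out_size(32, kernel=2, stride=2))  # 16
# stride smaller than the kernel gives overlapping windows
print(pool_out_size(32, kernel=2, stride=1))  # 31
```

Note that `nn.MaxPool2d` defaults `stride` to `kernel_size`, which is exactly the non-overlapping case above.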
Task 3: Fill in the blank (hard)

Fix the error in the forward method to apply ReLU activation after convolution.

PyTorch
def forward(self, x):
    x = self.conv1(x)
    x = nn.[1]()(x)
    x = self.pool(x)
    return x
A. ReLU
B. Sigmoid
C. Softmax
D. Tanh
Common Mistakes
Softmax and Sigmoid are incorrect here; they belong at the output layer, not after a hidden convolution.
Tanh works but saturates for large inputs; ReLU is the standard choice in CNN hidden layers.
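ReLU itself is just `max(0, x)` applied elementwise, which a quick pure-Python sketch makes concrete (illustrative only; in PyTorch you would use `nn.ReLU` or `torch.relu`):

```python
def relu(x):
    # ReLU zeroes negative activations and passes positives unchanged,
    # avoiding the saturation that sigmoid/tanh show for large |x|
    return max(0.0, x)

print([relu(v) for v in [-2.0, -0.5, 0.0, 1.5]])  # [0.0, 0.0, 0.0, 1.5]
```

Because its gradient is 1 for all positive inputs, ReLU avoids the vanishing-gradient problem that saturating activations cause in deep networks.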
Task 4: Fill in the blank (hard)

Fill both blanks to flatten the tensor and pass it to a fully connected layer.

PyTorch
x = x.[1](x.size(0), -1)
x = self.[2](x)
A. view
B. fc
C. reshape
D. linear
Common Mistakes
reshape also works, but view is the conventional choice in PyTorch (it requires a contiguous tensor and never copies data).
Calling a linear layer that was never defined in __init__ raises an AttributeError.
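What `x.view(x.size(0), -1)` does can be mimicked on nested lists: keep the batch dimension and collapse everything else into one feature vector per sample. A pure-Python sketch (illustrative helper, not the PyTorch implementation):

```python
def flatten_batch(batch):
    # Mimics x.view(x.size(0), -1): keep the batch dimension,
    # collapse channels/height/width into one flat feature list per sample
    return [[v for channel in sample for row in channel for v in row]
            for sample in batch]

# one sample with 2 channels of 2x2 -> 8 features
batch = [[[[1, 2], [3, 4]], [[5, 6], [7, 8]]]]
flat = flatten_batch(batch)
print(len(flat), len(flat[0]))  # 1 8
```

The `-1` in `view` tells PyTorch to infer that flattened length (channels * height * width) automatically, so the fully connected layer receives a 2-D (batch, features) tensor.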
Task 5: Fill in the blank (hard)

Fill all three blanks to define the final output layer with correct input size and activation.

PyTorch
self.fc = nn.Linear([1], [2])
output = self.fc(x)
output = nn.[3]()(output)
A. 128
B. 10
C. LogSoftmax
D. ReLU
Common Mistakes
ReLU in the output layer is wrong for classification; it cannot produce (log-)probabilities over classes.
An input size that does not match the flattened feature count causes a shape-mismatch error.
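Why LogSoftmax (and not ReLU) belongs at the output can be checked numerically: exponentiating its outputs must give a probability distribution. A pure-Python sketch of the numerically stable formula (illustrative only; in PyTorch you would use `nn.LogSoftmax(dim=1)` or pair raw logits with `nn.CrossEntropyLoss`):

```python
import math

def log_softmax(logits):
    # Numerically stable log-softmax: subtract the max, then
    # log_softmax(x_i) = (x_i - m) - log(sum_j exp(x_j - m))
    m = max(logits)
    log_sum = math.log(sum(math.exp(x - m) for x in logits))
    return [x - m - log_sum for x in logits]

out = log_softmax([1.0, 2.0, 3.0])
# exponentiating recovers class probabilities that sum to 1
print(round(sum(math.exp(v) for v in out), 6))  # 1.0
```

ReLU would instead clamp negative logits to 0 and leave the rest unnormalized, so its outputs cannot be interpreted as class probabilities.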