Challenge - 5 Problems
Flatten Layer Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
intermediate · 2:00 remaining
Output shape after Flatten layer in PyTorch
Given the following PyTorch code, what is the shape of the tensor after applying the Flatten layer?
PyTorch
import torch
import torch.nn as nn

x = torch.randn(10, 3, 28, 28)  # batch of 10 images, 3 channels, 28x28 pixels
flatten = nn.Flatten()
out = flatten(x)
print(out.shape)
Attempts: 2 left
💡 Hint
Flatten combines all dimensions except the batch dimension into one.
✗ Incorrect
The Flatten layer keeps the batch dimension (size 10) and flattens the remaining dimensions into one (3 * 28 * 28 = 2352), so the output shape is torch.Size([10, 2352]).
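The answer can be verified directly (a minimal sketch, assuming PyTorch is installed); the default Flatten is equivalent to a reshape that keeps the batch dimension:

```python
import torch
import torch.nn as nn

x = torch.randn(10, 3, 28, 28)  # batch of 10 images, 3 channels, 28x28 pixels
out = nn.Flatten()(x)
print(out.shape)  # torch.Size([10, 2352])

# Equivalent manual reshape: keep dim 0 (batch), merge the rest
manual = x.reshape(10, -1)
print(out.shape == manual.shape)  # True
```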
❓ Model Choice
intermediate · 2:00 remaining
Choosing Flatten layer position in a CNN
In a convolutional neural network (CNN), where should you place the Flatten layer?
Attempts: 2 left
💡 Hint
Flatten prepares data for fully connected layers by converting multi-dimensional data to 2D.
✗ Incorrect
Flatten is used after convolutional and pooling layers to convert the multi-dimensional feature maps into a 2D tensor for fully connected layers.
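A minimal CNN sketch of that placement (layer sizes here are illustrative, not from the question): Flatten sits between the conv/pool stack and the linear classifier head.

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),  # (N, 16, 28, 28)
    nn.ReLU(),
    nn.MaxPool2d(2),                             # (N, 16, 14, 14)
    nn.Flatten(),                                # (N, 16 * 14 * 14) = (N, 3136)
    nn.Linear(16 * 14 * 14, 10),                 # fully connected head, 10 classes
)

x = torch.randn(4, 3, 28, 28)
print(model(x).shape)  # torch.Size([4, 10])
```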
❓ Hyperparameter
advanced · 2:00 remaining
Effect of start_dim parameter in PyTorch Flatten
What is the effect of setting start_dim=2 in nn.Flatten(start_dim=2) when applied to a tensor of shape (5, 4, 3, 2)?
Attempts: 2 left
💡 Hint
start_dim defines the first dimension to flatten from.
✗ Incorrect
With start_dim=2, dimensions 2 and 3 (sizes 3 and 2) are flattened into a single dimension of size 3*2 = 6, so the shape becomes (5, 4, 6).
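A quick check of this behavior (assuming PyTorch is installed):

```python
import torch
import torch.nn as nn

x = torch.randn(5, 4, 3, 2)

# start_dim=2: dims 0 and 1 are untouched; dims 2 and 3 merge (3 * 2 = 6)
flatten = nn.Flatten(start_dim=2)
print(flatten(x).shape)  # torch.Size([5, 4, 6])
```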
🔧 Debug
advanced · 2:00 remaining
Debugging Flatten layer usage error
What error will this PyTorch code raise and why?
import torch
import torch.nn as nn
x = torch.randn(8, 3, 32, 32)
flatten = nn.Flatten(start_dim=4)
out = flatten(x)
Attempts: 2 left
💡 Hint
start_dim must be a valid dimension index: for a 4-D tensor, the valid range is [-4, 3].
✗ Incorrect
The tensor has 4 dimensions, indexed 0-3 (or -4 to -1). start_dim=4 is out of range, so PyTorch raises an IndexError (Dimension out of range: expected to be in range of [-4, 3], but got 4).
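A minimal reproduction of the failure, assuming PyTorch is installed:

```python
import torch
import torch.nn as nn

x = torch.randn(8, 3, 32, 32)  # 4-D tensor: valid dim indices are 0-3 (or -4..-1)

try:
    nn.Flatten(start_dim=4)(x)
except IndexError as e:
    print("IndexError:", e)  # dimension out of range
```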
🧠 Conceptual
expert · 2:00 remaining
Why Flatten layer is crucial before fully connected layers
Why is the Flatten layer necessary before feeding data into fully connected (linear) layers in neural networks?
Attempts: 2 left
💡 Hint
Think about the input shape requirements of linear layers.
✗ Incorrect
Fully connected (nn.Linear) layers operate on a fixed number of input features in their last dimension. Flatten collapses each sample's multi-dimensional features into a single vector, producing the (batch_size, num_features) shape that matches nn.Linear(in_features, out_features).
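A short sketch of why the reshape matters (layer sizes are illustrative, assuming PyTorch): without Flatten, a Linear layer would see a last dimension of 32 instead of the 3072 features it expects.

```python
import torch
import torch.nn as nn

x = torch.randn(8, 3, 32, 32)    # batch of 8 feature maps
fc = nn.Linear(3 * 32 * 32, 10)  # expects 3072 input features

flat = nn.Flatten()(x)           # (8, 3072): one feature vector per sample
print(fc(flat).shape)            # torch.Size([8, 10])
```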