Challenge - 5 Problems
GAN Mastery Badge
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
intermediate · 2:00 remaining
Output of a simple Generator forward pass
Given the following PyTorch Generator model and input noise tensor, what is the shape of the output tensor after the forward pass?
PyTorch
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(100, 256),
            nn.ReLU(),
            nn.Linear(256, 784),
            nn.Tanh()
        )

    def forward(self, x):
        return self.net(x)

G = Generator()
noise = torch.randn(16, 100)
output = G(noise)
output_shape = output.shape
Attempts: 2 left
💡 Hint
The Generator transforms noise vectors of size 100 into images flattened to 784 pixels.
✗ Incorrect
The input noise has batch size 16 and 100 features per sample. The final Linear layer maps 256 features to 784, so the output shape is (16, 784): batch size unchanged, 784 features per sample.
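The shape can be verified by running the forward pass from the question directly (the `nn.Sequential` below reproduces the same layers without the class wrapper):

```python
import torch
import torch.nn as nn

# Same layer stack as the question's Generator: 100-dim noise -> 784-dim output
G = nn.Sequential(
    nn.Linear(100, 256),
    nn.ReLU(),
    nn.Linear(256, 784),
    nn.Tanh(),
)

noise = torch.randn(16, 100)   # batch of 16 noise vectors, 100 features each
output = G(noise)
print(output.shape)            # torch.Size([16, 784])
```

Only the last Linear layer determines the feature dimension of the output; ReLU and Tanh are elementwise and leave the shape unchanged.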
❓ Model Choice
intermediate · 2:00 remaining
Choosing the correct Discriminator output layer
Which of the following PyTorch Discriminator output layers is correct for producing a single probability score for real/fake classification?
Attempts: 2 left
💡 Hint
The Discriminator outputs a probability between 0 and 1 for real or fake.
✗ Incorrect
The Discriminator should output a single value per input, representing the probability that the input is real. Applying nn.Sigmoid() to that final output squashes it to a probability between 0 and 1, which is what Binary Cross Entropy expects.
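A minimal sketch of such a Discriminator head is shown below; the hidden width of 256 is illustrative (not from the quiz), and the 784-dim input matches the flattened images from the first question:

```python
import torch
import torch.nn as nn

# Illustrative Discriminator: 784-dim flattened image -> single probability.
D = nn.Sequential(
    nn.Linear(784, 256),
    nn.LeakyReLU(0.2),
    nn.Linear(256, 1),   # one raw score per sample
    nn.Sigmoid(),        # squash the score into (0, 1)
)

images = torch.randn(16, 784)
probs = D(images)
print(probs.shape)   # torch.Size([16, 1]); every value lies in (0, 1)
```

Because Sigmoid maps any real score into (0, 1), each output can be read directly as the probability of the "real" class.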
❓ Hyperparameter
advanced · 2:00 remaining
Effect of learning rate on GAN training stability
In training a GAN, what is the most likely effect of setting the learning rate too high for both Generator and Discriminator?
Attempts: 2 left
💡 Hint
High learning rates can cause the model weights to jump too much during updates.
✗ Incorrect
Learning rates that are too high destabilize GAN training: the losses oscillate instead of converging, and the Generator can fall into mode collapse, producing only a limited variety of samples.
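One commonly used starting point, a sketch rather than a rule, is the DCGAN-style Adam configuration with a small learning rate and beta1 = 0.5 (the toy models here are placeholders, not the quiz's architecture):

```python
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(100, 784), nn.Tanh())    # toy Generator
D = nn.Sequential(nn.Linear(784, 1), nn.Sigmoid())   # toy Discriminator

# A small lr (2e-4) and beta1=0.5 are a common heuristic for keeping the
# two-player optimization stable; a much larger lr tends to make the
# losses oscillate.
opt_G = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))
```

If training still oscillates, lowering the learning rate for both networks is usually the first knob to turn.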
❓ Metrics
advanced · 2:00 remaining
Choosing the right loss function for GAN Discriminator
Which loss function is commonly used for the Discriminator in a vanilla GAN to distinguish real from fake images?
Attempts: 2 left
💡 Hint
The Discriminator outputs a probability for two classes: real or fake.
✗ Incorrect
Binary Cross Entropy Loss is the standard choice for binary classification tasks like real vs. fake, and it is what the Discriminator in a vanilla GAN uses.
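In PyTorch this is nn.BCELoss, applied to sigmoid-squashed Discriminator outputs; the probability values below are made up for illustration:

```python
import torch
import torch.nn as nn

criterion = nn.BCELoss()

# Hypothetical Discriminator outputs (already passed through Sigmoid)
d_real = torch.tensor([0.9, 0.8])   # scores for real images
d_fake = torch.tensor([0.2, 0.1])   # scores for fake images

# Real images are labeled 1, fakes 0
loss_real = criterion(d_real, torch.ones_like(d_real))
loss_fake = criterion(d_fake, torch.zeros_like(d_fake))
d_loss = loss_real + loss_fake
```

The Discriminator is trained to push real scores toward 1 and fake scores toward 0, so both terms shrink as it improves.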
🔧 Debug
expert · 2:00 remaining
Identifying the cause of NaN loss in GAN training
During GAN training, the Generator loss suddenly becomes NaN. Which of the following is the most likely cause?
Attempts: 2 left
💡 Hint
Check if the Discriminator output is properly squashed before computing log loss.
✗ Incorrect
If the Discriminator's outputs are not squashed into [0, 1], the log terms in Binary Cross Entropy can receive 0 or negative arguments, yielding inf or NaN losses.
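A common safeguard, shown here as a sketch rather than the quiz's answer, is to drop the final Sigmoid layer and use nn.BCEWithLogitsLoss, which fuses the sigmoid and the log into one numerically stable operation:

```python
import torch
import torch.nn as nn

criterion = nn.BCEWithLogitsLoss()   # sigmoid + BCE fused, numerically stable

# Raw Discriminator logits, including extreme values that would push a
# separate Sigmoid to exactly 0.0 or 1.0 in float32 and break log().
logits = torch.tensor([100.0, -100.0, 0.0])
labels = torch.tensor([1.0, 0.0, 1.0])

loss = criterion(logits, labels)
print(torch.isnan(loss).item())   # False: the fused form never evaluates log(0)
```

With raw logits as input, the loss stays finite even for extreme scores, which is why most GAN implementations prefer this formulation.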