PyTorch · ~20 mins

Generator and discriminator in PyTorch - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output
intermediate
Output of a simple Generator forward pass
Given the following PyTorch Generator model and input noise tensor, what is the shape of the output tensor after the forward pass?
PyTorch
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(100, 256),
            nn.ReLU(),
            nn.Linear(256, 784),
            nn.Tanh()
        )

    def forward(self, x):
        return self.net(x)

G = Generator()
noise = torch.randn(16, 100)
output = G(noise)
output_shape = output.shape
A. (16, 784)
B. (100, 256)
C. (16, 256)
D. (784, 16)
💡 Hint
The Generator transforms noise vectors of size 100 into images flattened to 784 pixels.
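If you want to check your reasoning empirically, the snippet below reproduces the quiz's Generator and prints the resulting shape:

```python
import torch
import torch.nn as nn

# Same Generator as in the question above.
class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(100, 256),   # noise dim 100 -> hidden 256
            nn.ReLU(),
            nn.Linear(256, 784),   # hidden 256 -> flattened 28x28 image
            nn.Tanh(),             # squashes pixel values into [-1, 1]
        )

    def forward(self, x):
        return self.net(x)

G = Generator()
noise = torch.randn(16, 100)       # batch of 16 noise vectors
output = G(noise)
print(tuple(output.shape))         # (16, 784)
```

Linear layers only transform the last dimension, so the batch dimension (16) passes through unchanged while 100 is mapped to 784.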
Model Choice
intermediate
Choosing the correct Discriminator output layer
Which of the following PyTorch Discriminator output layers is correct for producing a single probability score for real/fake classification?
A. nn.Linear(512, 2) followed by nn.LogSoftmax(dim=1)
B. nn.Linear(512, 10) followed by nn.Softmax(dim=1)
C. nn.Linear(512, 1) with no activation
D. nn.Linear(512, 1) followed by nn.Sigmoid()
💡 Hint
The Discriminator outputs a probability between 0 and 1 for real or fake.
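As a sanity check, here is a minimal sketch of a Discriminator head that emits one probability per image. The 512-dim feature input matches the quiz options; the earlier convolutional or linear layers are assumed:

```python
import torch
import torch.nn as nn

# Head only: one logit per image, squashed into (0, 1) by a sigmoid.
head = nn.Sequential(
    nn.Linear(512, 1),   # single score per example
    nn.Sigmoid(),        # probability that the input is real
)

features = torch.randn(16, 512)   # hypothetical feature batch
probs = head(features)
print(tuple(probs.shape))                                  # (16, 1)
print(bool((probs >= 0).all() and (probs <= 1).all()))     # True
```

Softmax over 2 or 10 units answers a multi-class question, and a bare linear layer can emit any real number; only a single sigmoid unit gives one value guaranteed to lie in (0, 1).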
Hyperparameter
advanced
Effect of learning rate on GAN training stability
In training a GAN, what is the most likely effect of setting the learning rate too high for both Generator and Discriminator?
A. Training becomes unstable, with oscillations and mode collapse
B. Training converges faster and produces better images
C. The Discriminator ignores fake images and outputs constant values
D. The Generator stops learning but the Discriminator improves steadily
💡 Hint
High learning rates can cause the model weights to jump too much during updates.
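For context, a commonly cited stability-oriented starting point is the DCGAN recipe: Adam with lr=2e-4 and betas=(0.5, 0.999) for both networks. This is a convention, not a guarantee, and the one-layer models below are placeholders:

```python
import torch
import torch.nn as nn

# Stand-in networks; real G and D would be deeper.
G = nn.Linear(100, 784)
D = nn.Linear(784, 1)

# DCGAN-style defaults: modest lr, lowered beta1 to damp momentum.
opt_G = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))

print(opt_G.defaults["lr"])   # 0.0002
```

Pushing the rate far above this range tends to make each player's updates overshoot the other's moving target, which is where the oscillation and collapse come from.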
Metrics
advanced
Choosing the right loss function for GAN Discriminator
Which loss function is commonly used for the Discriminator in a vanilla GAN to distinguish real from fake images?
A. Hinge Loss
B. Mean Squared Error Loss
C. Binary Cross Entropy Loss
D. Categorical Cross Entropy Loss
💡 Hint
The Discriminator outputs a probability for two classes: real or fake.
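To see the idea concretely, here is binary cross entropy applied to discriminator probabilities; the probability and label values are illustrative only:

```python
import torch
import torch.nn as nn

# BCE compares the Discriminator's probabilities against binary labels:
# label 1 means "real", label 0 means "fake".
criterion = nn.BCELoss()

probs = torch.tensor([0.9, 0.2, 0.7, 0.1])   # D's outputs on a mixed batch
labels = torch.tensor([1.0, 0.0, 1.0, 0.0])  # ground truth: real/fake

loss = criterion(probs, labels)
print(loss.item() < 0.5)   # True: loss is small because D is mostly right
```

Hinge and least-squares losses do appear in later GAN variants, but the vanilla GAN objective is exactly binary cross entropy over the two classes real and fake.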
🔧 Debug
expert
Identifying the cause of NaN loss in GAN training
During GAN training, the Generator loss suddenly becomes NaN. Which of the following is the most likely cause?
A. The batch size is too large, causing memory overflow
B. The Discriminator outputs values outside the range [0, 1], causing log(0) in the loss
C. The Generator uses ReLU activation in the output layer instead of Tanh
D. The optimizer learning rate is set to zero
💡 Hint
Check if the Discriminator output is properly squashed before computing log loss.
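The failure mode is easy to reproduce with a hand-rolled BCE term; the values below are illustrative. Feeding raw logits to BCEWithLogitsLoss, which fuses the sigmoid and the log in a numerically stable way, is the usual fix:

```python
import torch
import torch.nn as nn

# Hand-rolled BCE term on an unsquashed output: log(0) is -inf, and one
# backward pass through an inf loss poisons the gradients with NaNs.
p = torch.tensor([0.0])                  # a D output that escaped the sigmoid
manual = -torch.log(p)                   # -log(0)
print(torch.isinf(manual).item())        # True

# Stable alternative: pass the raw logit to BCEWithLogitsLoss.
logit = torch.tensor([-20.0])
target = torch.tensor([1.0])
stable = nn.BCEWithLogitsLoss()(logit, target)
print(torch.isfinite(stable).item())     # True: large loss, but finite
```

This is why many implementations drop the final Sigmoid from the Discriminator entirely and let the loss function handle the squashing.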