Challenge - 5 Problems
Freezing Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
Intermediate · 2:00 remaining
Output of model training with frozen layers
Consider a TensorFlow model where the first two layers are frozen before training. What will be the effect on the training loss after one epoch?
TensorFlow
import tensorflow as tf
import numpy as np
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Dense(10, activation='relu', input_shape=(5,)),
    layers.Dense(10, activation='relu'),
    layers.Dense(1)
])

# Freeze first two layers
for layer in model.layers[:2]:
    layer.trainable = False

model.compile(optimizer='adam', loss='mse')

x = np.random.random((100, 5))
y = np.random.random((100, 1))

history = model.fit(x, y, epochs=1, verbose=0)
print(round(history.history['loss'][0], 3))
Attempts: 2 left
💡 Hint
Freezing layers means their weights do not update, but the model can still train other layers.
✗ Incorrect
Freezing layers sets their trainable attribute to False, so their weights stay fixed during training. The model still updates the unfrozen layers, so the training loss decreases; on this random data it is typically a small float, below 1.0 after one epoch.
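This explanation can be verified directly. The sketch below (assuming TensorFlow is installed; layer sizes mirror the challenge code) snapshots a frozen layer's kernel before fit and confirms it is unchanged afterwards, while the unfrozen output layer does change:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

tf.random.set_seed(0)
np.random.seed(0)

model = models.Sequential([
    layers.Dense(10, activation='relu', input_shape=(5,)),
    layers.Dense(10, activation='relu'),
    layers.Dense(1),
])

# Freeze the first two layers BEFORE compiling.
for layer in model.layers[:2]:
    layer.trainable = False

model.compile(optimizer='adam', loss='mse')

# Snapshot kernels before training.
frozen_before = model.layers[0].get_weights()[0].copy()
output_before = model.layers[2].get_weights()[0].copy()

x = np.random.random((100, 5))
y = np.random.random((100, 1))
model.fit(x, y, epochs=1, verbose=0)

# Frozen weights are bit-identical; the unfrozen layer's weights moved.
print(np.allclose(frozen_before, model.layers[0].get_weights()[0]))  # True
print(np.allclose(output_before, model.layers[2].get_weights()[0]))  # False
```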
❓ Model Choice
Intermediate · 1:30 remaining
Choosing layers to unfreeze for fine-tuning
You have a pretrained convolutional neural network for image classification. To fine-tune it on a new dataset, which layers should you unfreeze?
Attempts: 2 left
💡 Hint
High-level features are learned in deeper layers.
✗ Incorrect
Unfreezing only the last few layers allows the model to adapt to new data while keeping basic features intact. Unfreezing all layers may cause overfitting or require more data and time.
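One minimal way to express "unfreeze only the last few layers" in Keras. This is a sketch using a tiny stand-in backbone, since the challenge names no specific network; in practice the base would be something like tf.keras.applications.MobileNetV2 with ImageNet weights:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Small stand-in for a pretrained backbone (hypothetical architecture).
base = models.Sequential([
    layers.Conv2D(8, 3, activation='relu', input_shape=(32, 32, 3)),
    layers.Conv2D(16, 3, activation='relu'),
    layers.GlobalAveragePooling2D(),
    layers.Dense(10),
])

# Freeze everything except the last two layers; early layers keep
# their generic low-level features fixed.
for layer in base.layers[:-2]:
    layer.trainable = False

# Only the top of the network remains trainable.
print([layer.trainable for layer in base.layers])  # [False, False, True, True]
```

Remember to (re)compile after setting the flags, before calling fit.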
❓ Hyperparameter
Advanced · 1:30 remaining
Effect of freezing layers on learning rate choice
When you freeze most layers of a pretrained model and only train a few layers, how should you adjust the learning rate?
Attempts: 2 left
💡 Hint
Few trainable layers need careful updates to avoid destroying pretrained weights.
✗ Incorrect
A smaller learning rate helps fine-tune the few trainable layers gently, preserving learned features and improving stability.
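In code this is just a matter of passing a reduced learning rate to the optimizer at compile time. The value 1e-4 below is a common starting point when 1e-3 is the from-scratch default, not a prescribed number:

```python
import tensorflow as tf

# Reduced learning rate for the fine-tuning phase (assumed value, tune per task).
fine_tune_optimizer = tf.keras.optimizers.Adam(learning_rate=1e-4)
print(float(fine_tune_optimizer.learning_rate))  # 0.0001

# Used as: model.compile(optimizer=fine_tune_optimizer, loss='mse')
```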
🔧 Debug
Advanced · 2:00 remaining
Why does unfreezing layers not update weights?
You unfreeze layers in a TensorFlow model by setting layer.trainable = True, but after training, weights do not change. What is the likely cause?
Attempts: 2 left
💡 Hint
Changing trainable flags requires recompiling the model.
✗ Incorrect
In TensorFlow, after changing layer.trainable, you must recompile the model to update the trainable variables list. Otherwise, training ignores the change.
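The fix can be sketched with a minimal model; the key step is the second compile() after flipping trainable, which refreshes the set of trainable variables used by fit:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Dense(4, activation='relu', input_shape=(3,)),
    layers.Dense(1),
])

model.layers[0].trainable = False
model.compile(optimizer='adam', loss='mse')  # compiled with layer 0 frozen

model.layers[0].trainable = True
model.compile(optimizer='adam', loss='mse')  # recompile so the change takes effect

before = model.layers[0].get_weights()[0].copy()
x = np.random.random((32, 3))
y = np.random.random((32, 1))
model.fit(x, y, epochs=1, verbose=0)

# After recompiling, the previously frozen layer trains again.
print(np.allclose(before, model.layers[0].get_weights()[0]))  # False
```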
🧠 Conceptual
Expert · 2:30 remaining
Impact of freezing layers on model capacity and generalization
Freezing layers in a pretrained model affects its capacity and generalization. Which statement best describes this impact?
Attempts: 2 left
💡 Hint
Think about how freezing limits learning but can prevent overfitting.
✗ Incorrect
Freezing layers reduces the number of trainable parameters, lowering capacity. This can help generalization on small datasets by preventing overfitting.
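The reduction in trainable parameters is easy to make concrete. In the sketch below (a toy two-layer model, assuming TensorFlow is installed), freezing the first layer moves its 60 parameters (5×10 kernel + 10 biases) out of the trainable set, leaving only the output layer's 11:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Dense(10, activation='relu', input_shape=(5,)),  # 5*10 + 10 = 60 params
    layers.Dense(1),                                        # 10*1 + 1 = 11 params
])

# Freeze the first layer: its 60 parameters become non-trainable.
model.layers[0].trainable = False

trainable = sum(int(tf.size(w)) for w in model.trainable_weights)
frozen = sum(int(tf.size(w)) for w in model.non_trainable_weights)
print(trainable, frozen)  # 11 60
```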