TensorFlow · ~20 mins

Freezing and unfreezing layers in TensorFlow - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output
intermediate
Output of model training with frozen layers
Consider a TensorFlow model where the first two layers are frozen before training. What will be the effect on the training loss after one epoch?
TensorFlow
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Dense(10, activation='relu', input_shape=(5,)),
    layers.Dense(10, activation='relu'),
    layers.Dense(1)
])

# Freeze first two layers
for layer in model.layers[:2]:
    layer.trainable = False

model.compile(optimizer='adam', loss='mse')

import numpy as np
x = np.random.random((100, 5))
y = np.random.random((100, 1))

history = model.fit(x, y, epochs=1, verbose=0)

print(round(history.history['loss'][0], 3))
A) An error due to missing input shape specification
B) A float value less than 1.0 representing the training loss
C) A TypeError because frozen layers cannot be trained
D) A float value greater than 10.0 representing the training loss
💡 Hint
Freezing layers means their weights do not update, but the model can still train other layers.
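As a sketch of what the hint describes (assuming TensorFlow 2.x with Keras), you can check directly that the frozen layers' weights stay fixed after a training epoch while the trainable head updates:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Same architecture as the challenge, with the first two layers frozen
model = models.Sequential([
    tf.keras.Input(shape=(5,)),
    layers.Dense(10, activation='relu'),
    layers.Dense(10, activation='relu'),
    layers.Dense(1),
])
for layer in model.layers[:2]:
    layer.trainable = False
model.compile(optimizer='adam', loss='mse')

# Snapshot kernels before training
frozen_before = model.layers[0].get_weights()[0].copy()
head_before = model.layers[-1].get_weights()[0].copy()

x = np.random.random((100, 5)).astype('float32')
y = np.random.random((100, 1)).astype('float32')
model.fit(x, y, epochs=1, verbose=0)

# Frozen kernel is untouched; the trainable head has moved
print(np.array_equal(frozen_before, model.layers[0].get_weights()[0]))  # True
print(np.array_equal(head_before, model.layers[-1].get_weights()[0]))   # False
```

Training proceeds normally on the remaining trainable layer, which is why the loss is still a small float (option B territory) rather than an error.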
Model Choice
intermediate
Choosing layers to unfreeze for fine-tuning
You have a pretrained convolutional neural network for image classification. To fine-tune it on a new dataset, which layers should you unfreeze?
A) No layers; keep all frozen
B) All layers including the input layer
C) Only the last few layers to adapt high-level features
D) Only the first convolutional layer
💡 Hint
High-level features are learned in deeper layers.
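A minimal sketch of this recipe, using a small stand-in backbone (a real workflow would instead load pretrained weights, e.g. `tf.keras.applications.MobileNetV2(weights='imagenet')`; the layer sizes and the 4-class head below are illustrative):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Stand-in "pretrained" backbone for illustration
base = models.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    layers.Conv2D(8, 3, activation='relu'),   # early layers: low-level features
    layers.Conv2D(8, 3, activation='relu'),
    layers.Flatten(),
    layers.Dense(16, activation='relu'),      # deeper layers: high-level features
])

# Mark everything trainable, then freeze all but the last few layers
base.trainable = True
for layer in base.layers[:-2]:
    layer.trainable = False

# New classification head for the target dataset
model = models.Sequential([base, layers.Dense(4, activation='softmax')])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')

print([layer.trainable for layer in base.layers])  # [False, False, True, True]
```

Only the deeper layers, which encode high-level, task-specific features, are left trainable, which is the idea behind option C.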
Hyperparameter
advanced
Effect of freezing layers on learning rate choice
When you freeze most layers of a pretrained model and only train a few layers, how should you adjust the learning rate?
A) Use a larger learning rate to speed up training
B) Learning rate does not matter when layers are frozen
C) Keep the learning rate the same as training from scratch
D) Use a smaller learning rate to avoid large updates on few trainable layers
💡 Hint
Few trainable layers need careful updates to avoid destroying pretrained weights.
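In Keras this just means passing an explicit, reduced learning rate when compiling for the fine-tuning phase. The value below is an illustrative choice, not a universal rule:

```python
import tensorflow as tf

# Keras Adam defaults to learning_rate=1e-3; fine-tuning commonly uses a
# much smaller value so the pretrained weights are not disturbed too much.
# 1e-5 here is a hypothetical, dataset-dependent choice.
fine_tune_optimizer = tf.keras.optimizers.Adam(learning_rate=1e-5)
print(float(fine_tune_optimizer.learning_rate))
```

The fine-tuned layers start from good pretrained values, so small, careful steps preserve them while adapting to the new data.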
🔧 Debug
advanced
Why does unfreezing layers not update weights?
You unfreeze layers in a TensorFlow model by setting layer.trainable = True, but after training, weights do not change. What is the likely cause?
A) The model was not recompiled after changing trainable flags
B) The optimizer does not support training unfrozen layers
C) The dataset is empty, so no training occurs
D) TensorFlow does not allow unfreezing layers after compilation
💡 Hint
Changing trainable flags requires recompiling the model.
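A sketch of the correct pattern (assuming TensorFlow 2.x): flip the flags, then compile again so the compiled training step picks up the new trainable set.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    tf.keras.Input(shape=(5,)),
    layers.Dense(10, activation='relu'),
    layers.Dense(1),
])
model.layers[0].trainable = False
model.compile(optimizer='adam', loss='mse')
print(len(model.trainable_weights))  # 2: only the head's kernel and bias

# Flipping the flag alone is not enough for training: the compiled train
# step was built against the old trainable set, so recompile afterwards.
model.layers[0].trainable = True
model.compile(optimizer='adam', loss='mse')
print(len(model.trainable_weights))  # 4: both layers' kernels and biases
```

Without the second `compile`, `fit` can keep using a training function that still treats the unfrozen layer as constant, which is why the weights appear not to change (option A).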
🧠 Conceptual
expert
Impact of freezing layers on model capacity and generalization
Freezing layers in a pretrained model affects its capacity and generalization. Which statement best describes this impact?
A) Freezing layers reduces model capacity but can improve generalization on small datasets
B) Freezing layers increases model capacity and always improves generalization
C) Freezing layers has no effect on capacity or generalization
D) Freezing layers reduces generalization by limiting learning
💡 Hint
Think about how freezing limits learning but can prevent overfitting.
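One way to make the capacity reduction concrete is to count trainable parameters before and after freezing. A sketch using the same small model as the first challenge:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    tf.keras.Input(shape=(5,)),
    layers.Dense(10, activation='relu'),   # 5*10 + 10 = 60 params
    layers.Dense(10, activation='relu'),   # 10*10 + 10 = 110 params
    layers.Dense(1),                       # 10*1 + 1 = 11 params
])
total = model.count_params()

# Freeze the first two layers
for layer in model.layers[:2]:
    layer.trainable = False

trainable = sum(int(tf.size(w)) for w in model.trainable_weights)
print(total, trainable)  # 181 11
```

The effective number of learnable parameters drops from 181 to 11: less capacity to fit the new data, but also less room to overfit a small dataset, which is the trade-off behind option A.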