Complete the code to load a pre-trained model for fine-tuning.
base_model = tf.keras.applications.MobileNetV2(input_shape=(224, 224, 3), include_top=False, weights=[1])
The pre-trained weights from ImageNet are loaded by setting weights='imagenet'. This helps the model start with learned features.
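Filled in, the answer reads as below (a sketch; the 224×224×3 input shape matches MobileNetV2's default, and downloading the ImageNet weights requires network access):

```python
import tensorflow as tf

# weights='imagenet' loads ImageNet-pretrained weights;
# include_top=False drops the classification head so a new one can be attached.
base_model = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3),
    include_top=False,
    weights='imagenet',
)
```

With `include_top=False` and a 224×224 input, the base outputs 7×7×1280 feature maps rather than class probabilities.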
Complete the code to freeze the base model layers during fine-tuning.
base_model.[1] = False
Setting base_model.trainable = False freezes the layers so they are not updated during training.
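The effect of the one-liner can be verified on any model — here a tiny stand-in network (hypothetical, for illustration only); the same line applies unchanged to the MobileNetV2 base:

```python
import tensorflow as tf

# A small stand-in model to demonstrate freezing.
base = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8),
    tf.keras.layers.Dense(2),
])

# Freezes every layer: no weights receive gradient updates during fit().
base.trainable = False
```

After this, the model reports zero trainable variables, so `fit()` leaves its weights untouched.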
Fix the error in the code to add a global average pooling layer after the base model.
x = base_model.output
x = tf.keras.layers.[1]()(x)
GlobalAveragePooling2D reduces each feature map to a single value by averaging, which is common after convolutional base models.
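The averaging behaviour can be checked in isolation by applying the layer to a dummy tensor shaped like the MobileNetV2 base output (a sketch; the shape is assumed from the 224×224 setup above):

```python
import tensorflow as tf

# GlobalAveragePooling2D collapses each HxWxC feature map to a length-C vector
# by averaging over the spatial dimensions.
pool = tf.keras.layers.GlobalAveragePooling2D()

features = tf.ones((1, 7, 7, 1280))  # dummy batch of base-model features
pooled = pool(features)              # shape (1, 1280); all-ones average to 1.0
```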
Fill both blanks to compile the fine-tuned model with an optimizer and loss function.
model.compile(optimizer=tf.keras.optimizers.[1](learning_rate=0.0001), loss='[2]', metrics=['accuracy'])
The Adam optimizer is popular for fine-tuning with a small learning rate such as 0.0001, so the pre-trained weights are not disturbed too much. For multi-class classification with one-hot labels, 'categorical_crossentropy' is used as the loss.
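Filled in, the compile step looks like the sketch below (the small classification head and the 10-class output are hypothetical, added only so the snippet is self-contained):

```python
import tensorflow as tf

# Hypothetical classification head on top of pooled base features.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1280,)),
    tf.keras.layers.Dense(10, activation='softmax'),  # 10 classes assumed
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.0001),
    loss='categorical_crossentropy',
    metrics=['accuracy'],
)
```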
Fill all three blanks to create a list of training callbacks for early stopping and saving the best model.
callbacks = [
tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=[1]),
tf.keras.callbacks.ModelCheckpoint(filepath='best_model.h5', save_best_only=[2], monitor='[3]')
]
EarlyStopping patience is set to 3 epochs. ModelCheckpoint saves only the best model based on validation loss.
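Filled in, the callbacks list reads as follows (a sketch; both callbacks monitor 'val_loss', so training must pass validation data for them to act):

```python
import tensorflow as tf

callbacks = [
    # Stop training once val_loss has not improved for 3 consecutive epochs.
    tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=3),
    # Overwrite the checkpoint only when val_loss reaches a new best value.
    tf.keras.callbacks.ModelCheckpoint(
        filepath='best_model.h5',
        save_best_only=True,
        monitor='val_loss',
    ),
]
```

Passing this list as `callbacks=callbacks` to `model.fit()` wires both behaviours into the training loop.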