Complete the code to set the learning rate for fine-tuning a TensorFlow model.
optimizer = tf.keras.optimizers.Adam(learning_rate=[1])
The learning rate for fine-tuning is usually small, such as 0.001, to avoid large updates that could harm the pretrained weights.
Complete the code to compile the model with the Adam optimizer using the correct learning rate for fine-tuning.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=[1]), loss='categorical_crossentropy', metrics=['accuracy'])
For fine-tuning, a very small learning rate such as 0.0001 gently adjusts the pretrained weights instead of overwriting them.
Fix the error in the code to correctly set the learning rate for fine-tuning.
optimizer = tf.keras.optimizers.Adam(learning_rate=[1])
The learning rate must be a float, not a string: learning_rate=0.001 is correct, learning_rate='0.001' is not.
Fill both blanks to create a learning rate schedule for fine-tuning that starts at 0.001 and decays by 0.1 every 10 epochs.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(initial_learning_rate=[1], decay_steps=[2], decay_rate=0.1)
The initial learning rate is 0.001. Decay steps are set to 1000 (assuming 100 steps per epoch × 10 epochs), so the rate decays once every 10 epochs.
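To see what values this schedule actually produces, the decay can be reproduced in plain Python. With staircase=False (the default for tf.keras.optimizers.schedules.ExponentialDecay), the schedule computes lr = initial_lr * decay_rate ** (step / decay_steps); a minimal sketch, assuming that formula:

```python
def exponential_decay(step, initial_lr=0.001, decay_steps=1000, decay_rate=0.1):
    # Same formula ExponentialDecay uses with staircase=False (the default):
    # the learning rate is multiplied by decay_rate once per decay_steps steps.
    return initial_lr * decay_rate ** (step / decay_steps)

# At 100 steps per epoch, step 1000 is epoch 10, step 2000 is epoch 20.
for step in (0, 1000, 2000):
    print(f"step {step}: lr = {exponential_decay(step):.6g}")
```

This confirms the schedule drops from 0.001 to 0.0001 after the first 1000 steps (10 epochs) and to 0.00001 after 2000.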
Fill all three blanks to define an Adam optimizer with a learning rate schedule starting at 0.0005, decaying every 500 steps by 0.5.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(initial_learning_rate=[1], decay_steps=[2], decay_rate=[3])
optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)
The initial learning rate is 0.0005, decay_steps is 500, and decay_rate is 0.5, so the learning rate is halved every 500 steps.
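As a quick check on these values, the same decay formula (lr = initial_lr * decay_rate ** (step / decay_steps), i.e. ExponentialDecay with the default staircase=False) can be evaluated by hand; a minimal sketch, assuming that formula:

```python
def exponential_decay(step, initial_lr=0.0005, decay_steps=500, decay_rate=0.5):
    # decay_rate=0.5 halves the learning rate once per decay_steps steps.
    return initial_lr * decay_rate ** (step / decay_steps)

for step in (0, 500, 1000):
    print(f"step {step}: lr = {exponential_decay(step):.6g}")
```

The learning rate halves from 0.0005 to 0.00025 at step 500 and to 0.000125 at step 1000, matching the blanks above.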