TensorFlow · ~10 mins

Learning rate for fine-tuning in TensorFlow - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1 - Fill in the blank (easy)

Complete the code to set the learning rate for fine-tuning a TensorFlow model.

TensorFlow
optimizer = tf.keras.optimizers.Adam(learning_rate=[1])
Drag options to blanks, or click a blank and then click an option.
A. 0.1
B. 0.001
C. 1
D. 10
Common Mistakes
Using a large learning rate like 0.1 or 1 causes unstable training.
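For reference, a minimal runnable sketch of the completed line, assuming TensorFlow is installed (0.001 is the conventional fine-tuning default among the options above):

```python
import tensorflow as tf

# Adam with the small learning rate typical for fine-tuning.
# Larger values such as 0.1 or 1 produce updates big enough to
# destabilize training and overwrite pretrained weights.
optimizer = tf.keras.optimizers.Adam(learning_rate=0.001)

# The optimizer stores the rate as a variable we can read back.
lr = float(optimizer.learning_rate.numpy())
print(lr)
```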
Task 2 - Fill in the blank (medium)

Complete the code to compile the model with the Adam optimizer using the correct learning rate for fine-tuning.

TensorFlow
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=[1]), loss='categorical_crossentropy', metrics=['accuracy'])
A. 0.1
B. 1
C. 0.01
D. 0.0001
Common Mistakes
Choosing a learning rate that is too large, which can cause the model to forget the features it learned during pretraining.
Task 3 - Fill in the blank (hard)

Fix the error in the code to correctly set the learning rate for fine-tuning.

TensorFlow
optimizer = tf.keras.optimizers.Adam(learning_rate=[1])
A. 1
B. '0.001'
C. 0.001
D. 0.01
Common Mistakes
Putting the learning rate value inside quotes, making it a string.
Task 4 - Fill in the blank (hard)

Fill both blanks to create a learning rate schedule for fine-tuning that starts at 0.001 and decays by a factor of 0.1 every 10 epochs.

TensorFlow
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(initial_learning_rate=[1], decay_steps=[2], decay_rate=0.1)
A. 0.001
B. 1000
C. 10
D. 0.01
Common Mistakes
Passing 10 (the number of epochs) as decay_steps. decay_steps counts optimizer steps, so it should be the number of steps that 10 epochs take, e.g. 1000.
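The formula behind ExponentialDecay makes the decay_steps choice concrete: learning rate = initial_rate * decay_rate ** (step / decay_steps). A framework-free sketch, assuming roughly 100 batches per epoch so that 10 epochs is about 1000 steps:

```python
def decayed_lr(step, initial_rate=0.001, decay_rate=0.1, decay_steps=1000):
    """Continuous exponential decay, as tf.keras's ExponentialDecay
    computes it with the default staircase=False."""
    return initial_rate * decay_rate ** (step / decay_steps)

# With decay_steps=1000 (~10 epochs at an assumed 100 steps/epoch),
# the rate drops by 10x every 1000 steps:
print(decayed_lr(0))     # 0.001
print(decayed_lr(1000))  # ~0.0001

# With the common mistake decay_steps=10, the rate collapses almost
# immediately -- by step 100 it has already decayed by 10 factors of 0.1:
print(decayed_lr(100, decay_steps=10))
```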
Task 5 - Fill in the blank (hard)

Fill all three blanks to define an Adam optimizer with a learning rate schedule that starts at 0.0005 and decays by a factor of 0.5 every 500 steps.

TensorFlow
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(initial_learning_rate=[1], decay_steps=[2], decay_rate=[3])
optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)
A. 0.0005
B. 500
C. 0.5
D. 0.05
Common Mistakes
Using a decay_rate greater than 1, which would increase the learning rate over time instead of decaying it.
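As a sanity check on the completed schedule (0.0005, 500, 0.5), here is a runnable sketch, assuming TensorFlow is installed. The schedule object is callable, mapping a step number to a learning rate, so the halving every 500 steps can be observed directly:

```python
import tensorflow as tf

# Schedule starting at 0.0005 and halving every 500 steps.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.0005, decay_steps=500, decay_rate=0.5)
optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)

# Calling the schedule shows the decayed rate at a given step.
print(float(lr_schedule(0)))     # ~0.0005
print(float(lr_schedule(500)))   # ~0.00025, halved once
print(float(lr_schedule(1000)))  # ~0.000125, halved twice
```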