TensorFlow · ~10 mins

Learning rate scheduling in TensorFlow - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to create an exponential decay learning rate schedule.

TensorFlow
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1,
    decay_steps=100000,
    decay_rate=[1],
    staircase=True)
A. 2.0
B. 0.5
C. 1.5
D. 0.96
Common Mistakes
- Using a decay rate greater than 1, which increases the learning rate instead of decreasing it.
- Using 0, which would immediately reduce the learning rate to zero.
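As a sanity check, the formula `ExponentialDecay` computes can be sketched in plain Python (the helper name and printed steps are illustrative, not TensorFlow API):

```python
def exponential_decay(step, initial_lr=0.1, decay_steps=100000,
                      decay_rate=0.96, staircase=True):
    # ExponentialDecay computes initial_lr * decay_rate ** (step / decay_steps);
    # with staircase=True the exponent is floored, so the rate drops in
    # discrete jumps every decay_steps steps instead of continuously.
    exponent = step // decay_steps if staircase else step / decay_steps
    return initial_lr * decay_rate ** exponent

print(exponential_decay(0))       # 0.1
print(exponential_decay(99999))   # still 0.1 (staircase holds the rate)
print(exponential_decay(100000))  # 0.1 * 0.96, the first staircase drop
```

With decay_rate=0.96 (option D), each full period multiplies the rate by a factor just under 1, which is the gradual decay the task is after.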
Task 2: Fill in the blank (medium)

Complete the code to apply the learning rate schedule to the Adam optimizer.

TensorFlow
optimizer = tf.keras.optimizers.Adam(learning_rate=[1])
A. lr_schedule
B. 0.001
C. 0.01
D. 0.1
Common Mistakes
- Passing a fixed float value instead of the schedule object, which silences the schedule entirely.
- Passing the schedule's name as a string instead of the variable itself.
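The point of the task: Keras optimizers accept either a plain float or a schedule object for `learning_rate`, and a schedule is called with the current step. A minimal plain-Python sketch of that dispatch (`resolve_lr` is a hypothetical helper, not a TensorFlow API):

```python
def resolve_lr(learning_rate, step):
    # Mimics how a Keras optimizer treats its learning_rate argument:
    # a schedule object is callable and is invoked with the current step,
    # while a plain float is used unchanged on every step.
    if callable(learning_rate):
        return learning_rate(step)
    return learning_rate

# A fixed rate never changes; a schedule decays as training progresses.
fixed = resolve_lr(0.001, step=5000)
scheduled = resolve_lr(lambda step: 0.1 * 0.96 ** (step // 1000), step=5000)
```

This is why passing `lr_schedule` (option A) matters: a float like 0.001 would keep the rate constant for the whole run.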
Task 3: Fill in the blank (hard)

Fix the error in the code to correctly create a piecewise constant learning rate schedule.

TensorFlow
boundaries = [10000, 20000]
values = [0.1, 0.01, [1]]
lr_schedule = tf.keras.optimizers.schedules.PiecewiseConstantDecay(boundaries, values)
A. 0.05
B. 0.1
C. 0.001
D. 0
Common Mistakes
- Using the same number of values as boundaries, which raises an error: there must be exactly one more value than boundaries.
- Using a value larger than the previous learning rates, so the rate increases instead of decaying.
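What `PiecewiseConstantDecay` does can be sketched in plain Python; the key invariant from the task is that `values` needs exactly one more entry than `boundaries` (the helper name and the choice of 0.001 for the blank are illustrative):

```python
def piecewise_constant(step, boundaries, values):
    # One value before the first boundary, one between each pair of
    # boundaries, and one after the last: len(values) == len(boundaries) + 1.
    assert len(values) == len(boundaries) + 1
    for boundary, value in zip(boundaries, values):
        if step <= boundary:
            return value
    return values[-1]

boundaries = [10000, 20000]
values = [0.1, 0.01, 0.001]   # assumes 0.001 for the blank, keeping the decay monotone
print(piecewise_constant(25000, boundaries, values))  # 0.001
```

Two boundaries split training into three regions, which is why three values are required.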
Task 4: Fill in the blank (hard)

Fill both blanks to create a cosine decay learning rate schedule with restarts.

TensorFlow
lr_schedule = tf.keras.optimizers.schedules.CosineDecayRestarts(
    initial_learning_rate=[1],
    first_decay_steps=[2])
A. 0.05
B. 1000
C. 500
D. 0.1
Common Mistakes
- Using an initial learning rate that is too large or too small for the model.
- Setting first_decay_steps to zero or a negative number.
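The shape `CosineDecayRestarts` produces can be sketched in plain Python, ignoring the `m_mul` amplitude multiplier and assuming `initial_learning_rate=0.05` and `first_decay_steps=1000` for the blanks. This is a sketch of the SGDR-style formula, not the exact TensorFlow implementation:

```python
import math

def cosine_decay_restarts(step, initial_lr=0.05, first_decay_steps=1000,
                          t_mul=2.0, alpha=0.0):
    # Each cycle follows half a cosine from initial_lr down toward
    # alpha * initial_lr, then restarts at full strength; with t_mul=2.0
    # every cycle lasts twice as long as the one before it.
    remaining, cycle_len = step, first_decay_steps
    while remaining >= cycle_len:
        remaining -= cycle_len
        cycle_len *= t_mul
    cosine = 0.5 * (1 + math.cos(math.pi * remaining / cycle_len))
    return initial_lr * ((1 - alpha) * cosine + alpha)

print(cosine_decay_restarts(0))     # 0.05, the start of the first cycle
print(cosine_decay_restarts(999))   # near the bottom of the first cycle
print(cosine_decay_restarts(1000))  # 0.05 again: the restart
```

A zero or negative `first_decay_steps` would make the very first cycle degenerate, which is why the task flags it.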
Task 5: Fill in the blank (hard)

Fill all three blanks to create a learning rate schedule that warms up linearly then decays exponentially.

TensorFlow
def lr_warmup_decay(step):
    warmup_steps = [1]
    if step < warmup_steps:
        return [2] * step / warmup_steps
    else:
        return [3] * tf.math.exp(-0.1 * (step - warmup_steps))
A. 1000
B. 0.01
C. 0.001
D. 0.1
Common Mistakes
- Confusing warmup_steps (a step count) with the learning-rate values.
- Using a decay base larger than the warmup's peak learning rate, which makes the schedule jump upward at the boundary.
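One plausible completion, assuming 1000 warmup steps (A) and a peak rate of 0.1 (D) for both rate blanks so the curve is continuous at the warmup boundary. The chosen answers follow the hints above, not an official key:

```python
import math

def lr_warmup_decay(step):
    warmup_steps = 1000            # blank [1]: a step count, not a rate
    if step < warmup_steps:
        # blank [2]: linear ramp from 0 up to the peak rate 0.1
        return 0.1 * step / warmup_steps
    # blank [3]: exponential decay starting exactly at the warmup peak,
    # so the schedule is continuous at step == warmup_steps
    return 0.1 * math.exp(-0.1 * (step - warmup_steps))

print(lr_warmup_decay(0))      # 0.0
print(lr_warmup_decay(500))    # 0.05, halfway up the ramp
print(lr_warmup_decay(1000))   # 0.1, the peak, then decay begins
```

Using the same 0.1 in both branches is what keeps the ramp's endpoint equal to the decay's starting point; a larger decay base would create the upward jump the mistakes list warns about.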