TensorFlow · ~20 mins

Learning rate scheduling in TensorFlow - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual
intermediate
What is the main purpose of learning rate scheduling in training neural networks?

Why do we use learning rate scheduling when training a neural network?

A. To gradually reduce the learning rate to help the model converge better and avoid overshooting minima.
B. To increase the learning rate over time to speed up training indefinitely.
C. To keep the learning rate constant throughout training for stable updates.
D. To randomly change the learning rate each epoch to add noise and prevent overfitting.
💡 Hint

Think about how changing the learning rate affects the model's ability to find the best solution.
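To see why a shrinking learning rate helps, here is a minimal plain-Python sketch (no TensorFlow needed; `run_gd` is an illustrative helper, not a library API). It runs gradient descent on f(x) = x², once with a constant, too-large rate and once with the same starting rate halved each step:

```python
def run_gd(lr_fn, steps=10, x=1.0):
    """Gradient descent on f(x) = x^2, whose gradient is 2x."""
    for t in range(steps):
        x = x - lr_fn(t) * 2 * x
    return x

# A constant, oversized learning rate overshoots the minimum at 0 and diverges.
x_const = run_gd(lambda t: 1.1)

# The same starting rate, halved each step, lets the iterates settle near 0.
x_decay = run_gd(lambda t: 1.1 * 0.5 ** t)

print(abs(x_const), abs(x_decay))  # |x_const| grows past 1; |x_decay| shrinks below 0.1
```

The decayed run takes the same big first step but then damps its updates, which is exactly the "converge without overshooting" behavior option A describes.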

Predict Output
intermediate
Output of learning rate after 5 epochs with ExponentialDecay

Given this TensorFlow learning rate schedule, what is the learning rate at epoch 5?

TensorFlow
import tensorflow as tf
initial_lr = 0.1
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=initial_lr,
    decay_steps=2,
    decay_rate=0.5,
    staircase=True
)
learning_rate_epoch_5 = lr_schedule(5).numpy()
print(round(learning_rate_epoch_5, 4))
A. 0.1
B. 0.0125
C. 0.05
D. 0.025
💡 Hint

Remember that with staircase=True, the learning rate changes only at multiples of decay_steps.
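You can check this by hand: with `staircase=True`, ExponentialDecay computes `initial_lr * decay_rate ** floor(step / decay_steps)`. A plain-Python sketch of that formula (`staircase_exp_decay` is an illustrative helper, not a TensorFlow API):

```python
def staircase_exp_decay(step, initial_lr=0.1, decay_steps=2, decay_rate=0.5):
    # staircase=True floors the exponent, so the rate drops in discrete
    # jumps once every `decay_steps` steps.
    return initial_lr * decay_rate ** (step // decay_steps)

for step in range(6):
    print(step, staircase_exp_decay(step))
# steps 0-1 -> 0.1, steps 2-3 -> 0.05, steps 4-5 -> 0.025
```

At step 5 the exponent is floor(5 / 2) = 2, giving 0.1 × 0.5² = 0.025.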

Model Choice
advanced
Which learning rate schedule is best for training a model that needs slow fine-tuning after initial fast learning?

You want a learning rate schedule that starts high and then slowly decreases to fine-tune the model. Which schedule fits best?

A. Cosine annealing schedule that gradually reduces learning rate following a cosine curve.
B. Constant learning rate with no changes during training.
C. Step decay schedule that drops learning rate sharply at fixed intervals.
D. Random learning rate changes each epoch.
💡 Hint

Think about smooth gradual decrease versus sudden drops.
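Cosine annealing follows half a cosine wave from the initial rate down toward a floor, which gives the smooth "fast early, gentle late" profile the question asks for. A plain-Python sketch mirroring the formula used by `tf.keras.optimizers.schedules.CosineDecay` (`cosine_annealed_lr` is an illustrative helper, not the TF API):

```python
import math

def cosine_annealed_lr(step, decay_steps, initial_lr=0.1, alpha=0.0):
    # lr traces half a cosine wave from initial_lr down to alpha * initial_lr.
    progress = min(step, decay_steps) / decay_steps
    cosine = 0.5 * (1.0 + math.cos(math.pi * progress))
    return initial_lr * ((1.0 - alpha) * cosine + alpha)

print(cosine_annealed_lr(0, 100))    # 0.1  (full rate at the start)
print(cosine_annealed_lr(50, 100))   # 0.05 (halfway down at the midpoint)
print(cosine_annealed_lr(100, 100))  # 0.0  (near zero for fine-tuning)
```

Unlike step decay (option C), there are no sudden drops, so the late-training updates shrink gradually rather than jumping.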

Hyperparameter
advanced
Effect of increasing decay_steps in ExponentialDecay schedule

In TensorFlow's ExponentialDecay schedule, what happens if you increase the decay_steps parameter while keeping others constant?

A. The learning rate decays faster over time.
B. The learning rate decays more slowly over time.
C. The learning rate stays constant and does not decay.
D. The learning rate increases over time.
💡 Hint

Think about how often the decay happens with bigger decay_steps.
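The effect follows directly from the formula: with `staircase=False`, ExponentialDecay computes `initial_lr * decay_rate ** (step / decay_steps)`, so a larger `decay_steps` shrinks the exponent at any given step. A plain-Python comparison (`exp_decay` is an illustrative helper, not a TensorFlow API):

```python
def exp_decay(step, initial_lr=0.1, decay_steps=100, decay_rate=0.5):
    # Continuous form (staircase=False): larger decay_steps stretches out
    # the exponent, so the rate shrinks more slowly per step.
    return initial_lr * decay_rate ** (step / decay_steps)

step = 500
fast = exp_decay(step, decay_steps=100)   # exponent 5.0 -> 0.1 * 0.5**5 = 0.003125
slow = exp_decay(step, decay_steps=1000)  # exponent 0.5 -> about 0.0707

print(fast, slow)  # the larger decay_steps keeps the learning rate higher
```

At step 500 the schedule with `decay_steps=1000` has barely started decaying, while the one with `decay_steps=100` has already halved five times.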

Metrics
expert
How does learning rate scheduling affect training loss and accuracy curves?

When using a learning rate schedule that reduces the learning rate over epochs, what typical pattern do you expect to see in training loss and accuracy graphs?

A. Training loss increases and accuracy decreases as learning rate lowers.
B. Training loss and accuracy fluctuate wildly with no clear trend.
C. Training loss decreases smoothly and accuracy increases steadily, with smaller improvements as learning rate lowers.
D. Training loss and accuracy remain constant throughout training.
💡 Hint

Consider how smaller learning rates affect model updates and convergence.
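The "smaller improvements as the rate lowers" pattern can be reproduced with a toy plain-Python run (no TensorFlow; `training_losses` is an illustrative helper): gradient descent on f(x) = x² with an exponentially decayed rate produces a loss curve that keeps falling, but by ever smaller amounts.

```python
def training_losses(steps=8, x=1.0, lr0=0.4, decay_rate=0.8):
    # Gradient descent on f(x) = x^2 (gradient 2x) with a decayed rate,
    # recording the loss before each update.
    losses = []
    for t in range(steps):
        losses.append(x * x)
        x = x - (lr0 * decay_rate ** t) * 2 * x
    return losses

losses = training_losses()
improvements = [a - b for a, b in zip(losses, losses[1:])]
print(losses)        # strictly decreasing
print(improvements)  # each improvement smaller than the last
```

Early steps, taken with a large rate, slash the loss; later steps, taken with a tiny rate, only nudge it — the smooth diminishing-returns curve described in option C.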