Recall & Review
beginner
What is batch normalization in machine learning?
Batch normalization is a technique that makes training faster and more stable by normalizing the inputs of each layer: it standardizes each batch to zero mean and unit variance, then applies a learned scale and shift so the model can still learn the distribution it needs.
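The normalization step above can be sketched in plain NumPy (an illustrative sketch, not the TensorFlow implementation; the function name and default `eps` are assumptions):

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize each feature to zero mean and unit variance over the batch,
    # then apply the learned scale (gamma) and shift (beta).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
out = batch_norm(x)
print(out.mean(axis=0))  # approximately 0 for each feature
print(out.std(axis=0))   # approximately 1 for each feature
```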
intermediate
Why do we use batch normalization during training?
We use batch normalization to reduce the problem of internal covariate shift, which means the distribution of inputs to layers changes during training. This helps the model train faster and generalize better.
intermediate
Which two parameters does batch normalization learn during training?
Batch normalization learns two parameters: gamma (scale) and beta (shift). These allow the model to restore the original distribution if needed after normalization.
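The claim that gamma and beta can restore the original distribution can be checked with a small NumPy sketch (illustrative only; in practice these values are learned by gradient descent, not set by hand):

```python
import numpy as np

x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
eps = 1e-5
mean, var = x.mean(axis=0), x.var(axis=0)
x_hat = (x - mean) / np.sqrt(var + eps)

# If training drives gamma toward sqrt(var + eps) and beta toward mean,
# the affine step exactly undoes the normalization:
gamma, beta = np.sqrt(var + eps), mean
restored = gamma * x_hat + beta
print(np.allclose(restored, x))  # True
```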
advanced
How does batch normalization behave differently during training and inference?
During training, batch normalization uses the mean and variance of the current batch. During inference, it uses a moving average of mean and variance collected during training for stable predictions.
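A minimal NumPy sketch of the two phases (the `momentum` value and the zero/one initialization mirror common framework defaults, but are assumptions here, not the exact TensorFlow code):

```python
import numpy as np

momentum, eps = 0.9, 1e-5
moving_mean, moving_var = 0.0, 1.0  # typical initialization

# Training: normalize with the current batch's statistics and
# update the moving averages for later use at inference time.
for batch in [np.array([1.0, 2.0, 3.0]), np.array([2.0, 3.0, 4.0])]:
    m, v = batch.mean(), batch.var()
    moving_mean = momentum * moving_mean + (1 - momentum) * m
    moving_var = momentum * moving_var + (1 - momentum) * v

# Inference: normalize a new input with the frozen moving averages,
# so the output does not depend on which batch it arrives in.
x_new = np.array([2.5])
y = (x_new - moving_mean) / np.sqrt(moving_var + eps)
print(y)
```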
beginner
Show a simple TensorFlow code snippet to add batch normalization after a dense layer.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dense(10, activation='softmax')
])
What problem does batch normalization mainly address?
Batch normalization reduces internal covariate shift by normalizing layer inputs.
Which parameters are learned by batch normalization?
Batch normalization learns gamma (scale) and beta (shift) parameters.
During inference, batch normalization uses:
Inference uses moving averages collected during training for stable normalization.
Where is batch normalization usually applied in a neural network?
Batch normalization is typically applied after linear layers and before activation.
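This common placement can be sketched in Keras (a hedged example of one convention, not the only valid one: the Dense layer carries no activation of its own, BatchNormalization normalizes the linear output, and the nonlinearity follows):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64),             # linear layer, no activation
    tf.keras.layers.BatchNormalization(),  # normalize the pre-activations
    tf.keras.layers.Activation('relu'),    # nonlinearity applied afterwards
    tf.keras.layers.Dense(10, activation='softmax'),
])
```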
Which of these is a benefit of batch normalization?
Batch normalization allows using higher learning rates and speeds up training.
Explain in your own words what batch normalization does and why it helps training.
Think about how changing input data affects learning and how normalization fixes it.
Describe how batch normalization behaves differently during training and inference phases.
Consider what data is available during training vs when the model is used.