Complete the code to add batch normalization after a dense layer in TensorFlow.
model = tf.keras.Sequential([
tf.keras.layers.Dense(64, activation='relu'),
tf.keras.layers.[1](),
tf.keras.layers.Dense(10, activation='softmax')
])
The BatchNormalization layer normalizes the outputs of the previous layer, which helps stabilize training.
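As a plain-Python sketch of what the completed layer computes for one feature (standard library only; the `batch_norm` helper name is illustrative, not part of the Keras API, and the learned scale/shift parameters are omitted):

```python
import math

def batch_norm(xs, epsilon=1e-5):
    # Normalize a batch of values for one feature to zero mean and unit
    # variance, as BatchNormalization does before its learned scale and shift.
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return [(x - mean) / math.sqrt(var + epsilon) for x in xs]

out = batch_norm([1.0, 2.0, 3.0])
print([round(v, 3) for v in out])  # [-1.225, 0.0, 1.225]
```

Whatever scale the dense layer's activations come out at, the normalized values land in a consistent range, which is the stability benefit the explanation describes.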
Complete the code to create a batch normalization layer with momentum 0.9.
bn_layer = tf.keras.layers.BatchNormalization(momentum=[1])
The momentum parameter controls how quickly the moving averages of the mean and variance are updated. 0.9 is a common choice (the Keras default is 0.99).
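A plain-Python sketch of the moving-average update that momentum controls (variable names are illustrative; this is the exponential-moving-average form Keras applies to each running statistic during training):

```python
def update_moving_mean(moving_mean, batch_mean, momentum=0.9):
    # Higher momentum -> the running estimate changes more slowly per batch.
    return momentum * moving_mean + (1 - momentum) * batch_mean

moving_mean = 0.0
for batch_mean in [1.0, 1.0, 1.0]:  # three batches whose mean is 1.0
    moving_mean = update_moving_mean(moving_mean, batch_mean)
print(round(moving_mean, 3))  # 0.271 -- drifting slowly toward 1.0
```

With momentum 0.99 the drift would be slower still, which is why a lower value like 0.9 is sometimes preferred for short training runs.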
Fix the error in the batch normalization usage by completing the code.
x = tf.keras.layers.Dense(128)(inputs)
x = tf.keras.layers.BatchNormalization[1](x)
In TensorFlow, layers are called like functions with parentheses to apply them to inputs.
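This works because Keras layers are callable objects: constructing the layer returns an instance, and calling that instance applies it to inputs. A toy illustration of the pattern (the `Double` class is made up for this example):

```python
class Double:
    # A minimal "layer": calling the instance applies it to an input,
    # mirroring the tf.keras.layers.Dense(128)(inputs) pattern.
    def __call__(self, x):
        return [2 * v for v in x]

layer = Double()        # first set of parentheses: construct the layer
out = layer([1, 2, 3])  # second set: call it like a function on inputs
print(out)  # [2, 4, 6]
```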
Fill both blanks to create a batch normalization layer that normalizes over the last axis and uses epsilon 1e-5.
bn = tf.keras.layers.BatchNormalization(axis=[1], epsilon=[2])
Axis -1 means normalize over the last axis (features). Epsilon 1e-5 avoids division by zero.
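A sketch of what axis=-1 means for a (batch, features) input: statistics are computed per feature column across the batch, not per sample (plain Python; `batch_norm_2d` is an illustrative name):

```python
import math

def batch_norm_2d(batch, epsilon=1e-5):
    # Normalize each feature (the last axis) across the batch independently,
    # which is what axis=-1 selects for a (batch, features) input.
    n_features = len(batch[0])
    out = [[0.0] * n_features for _ in batch]
    for j in range(n_features):
        col = [row[j] for row in batch]
        mean = sum(col) / len(col)
        var = sum((v - mean) ** 2 for v in col) / len(col)
        for i, v in enumerate(col):
            out[i][j] = (v - mean) / math.sqrt(var + epsilon)
    return out

# Two samples, two features on very different scales;
# each column independently ends up near mean 0, unit variance.
print(batch_norm_2d([[1.0, 10.0], [3.0, 30.0]]))
```

Note that epsilon is added to the variance inside the square root, so a feature with zero variance still produces a finite (zero) output instead of a division by zero.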
Fill all three blanks to create a model with batch normalization after each dense layer and dropout after the first batch norm.
model = tf.keras.Sequential([
tf.keras.layers.Dense(128, activation='relu'),
tf.keras.layers.[1](),
tf.keras.layers.[2](0.5),
tf.keras.layers.Dense(64, activation='relu'),
tf.keras.layers.[3](),
tf.keras.layers.Dense(10, activation='softmax')
])
BatchNormalization layers normalize activations. Dropout randomly disables neurons during training to reduce overfitting.
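A plain-Python sketch of inverted dropout, the variant Keras uses: surviving activations are scaled by 1/(1 - rate) so the expected activation is unchanged, and at inference time the layer is a no-op (the `dropout` function and fixed seed here are illustrative):

```python
import random

def dropout(xs, rate=0.5, training=True, seed=0):
    # At inference time, Dropout passes inputs through unchanged.
    if not training:
        return list(xs)
    rng = random.Random(seed)  # fixed seed only to make the sketch repeatable
    keep = 1.0 - rate
    # Zero each activation with probability `rate`; scale survivors by 1/keep
    # so the expected value of each activation stays the same.
    return [x / keep if rng.random() < keep else 0.0 for x in xs]

print(dropout([1.0, 1.0, 1.0, 1.0], rate=0.5))   # dropped units -> 0.0, kept -> 2.0
print(dropout([1.0, 1.0], training=False))       # [1.0, 1.0]
```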