
Batch normalization in TensorFlow - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (Easy)

Complete the code to add batch normalization after a dense layer in TensorFlow.

TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.[1](),
    tf.keras.layers.Dense(10, activation='softmax')
])
A. BatchNormalization
B. Dropout
C. Flatten
D. Conv2D
Common Mistakes
Using Dropout instead of BatchNormalization.
Forgetting to add batch normalization after the dense layer.
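The completed model from this task can be sketched as a minimal runnable example (the input size of 32 features is an assumption for illustration; only the layer sizes come from the task):

```python
import tensorflow as tf

# Blank [1] is BatchNormalization: it normalizes the dense layer's
# activations across the batch, which stabilizes and speeds up training.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dense(10, activation='softmax')
])
model.build(input_shape=(None, 32))  # assumed 32-feature input
```

Dropout would randomly zero activations rather than normalize them, which is why it is the distractor called out under Common Mistakes.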
Task 2: Fill in the blank (Medium)

Complete the code to create a batch normalization layer with momentum 0.9.

TensorFlow
bn_layer = tf.keras.layers.BatchNormalization(momentum=[1])
A. 0.5
B. 0.1
C. 1.0
D. 0.9
Common Mistakes
Setting momentum to 1.0 disables moving average updates.
Using too low momentum like 0.1 can make training unstable.
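The correct answer in runnable form, with a comment on what `momentum` actually controls (the update rule shown is the standard moving-average formula used by this layer):

```python
import tensorflow as tf

# momentum controls the exponential moving average of batch statistics
# that the layer uses at inference time:
#   moving_mean = moving_mean * momentum + batch_mean * (1 - momentum)
# momentum=1.0 would never update the moving averages; a very low value
# like 0.1 makes them track individual batches too closely.
bn_layer = tf.keras.layers.BatchNormalization(momentum=0.9)
```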
Task 3: Fill in the blank (Hard)

Fix the error in the batch normalization usage by completing the code.

TensorFlow
x = tf.keras.layers.Dense(128)(inputs)
x = tf.keras.layers.BatchNormalization[1](x)
A. .apply
B. ()
C. .call
D. .fit
Common Mistakes
Using .call or .apply instead of calling the layer as a function.
Forgetting parentheses causes a syntax error.
Task 4: Fill in the blank (Hard)

Fill both blanks to create a batch normalization layer that normalizes over the last axis and uses epsilon 1e-5.

TensorFlow
bn = tf.keras.layers.BatchNormalization(axis=[1], epsilon=[2])
A. -1
B. 0
C. 1e-5
D. 1e-3
Common Mistakes
Using axis=0 normalizes over the batch dimension, which is incorrect.
Using too large an epsilon, such as 1e-3, can reduce normalization effectiveness.
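Both blanks filled in, with comments explaining the two parameters:

```python
import tensorflow as tf

# axis=-1 normalizes each feature (the last axis) over the batch, which
# is the standard choice for dense layers; epsilon is a small constant
# added to the variance for numerical stability when dividing.
bn = tf.keras.layers.BatchNormalization(axis=-1, epsilon=1e-5)
```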
Task 5: Fill in the blank (Hard)

Fill all three blanks to create a model with batch normalization after each dense layer and dropout after the first batch norm.

TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.[1](),
    tf.keras.layers.[2](0.5),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.[3](),
    tf.keras.layers.Dense(10, activation='softmax')
])
A. BatchNormalization
B. Dropout
C. Flatten
Common Mistakes
Placing dropout before batch normalization.
Forgetting to add batch normalization after the second dense layer.
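The full model with all three blanks filled in, annotated to show which option goes where:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.BatchNormalization(),  # blank [1]: normalize first
    tf.keras.layers.Dropout(0.5),          # blank [2]: dropout after batch norm
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.BatchNormalization(),  # blank [3]: batch norm again
    tf.keras.layers.Dense(10, activation='softmax')
])
```

Placing Dropout before BatchNormalization is the common mistake here: dropout's zeroed activations would distort the batch statistics that the normalization layer computes.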