TensorFlow · ~10 mins

Weight initialization strategies in TensorFlow - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to initialize weights with zeros in TensorFlow.

TensorFlow
initializer = tf.keras.initializers.[1]()
A. HeNormal
B. Zeros
C. GlorotUniform
D. RandomNormal
Common Mistakes
Using RandomNormal instead of Zeros will initialize weights randomly, not zeros.
GlorotUniform and HeNormal are variance-scaling initializers; neither produces all-zero weights.
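A minimal sketch of the zeros initializer this task targets; the shape here is arbitrary:

```python
import tensorflow as tf

# Zeros sets every generated weight to exactly 0.0.
init = tf.keras.initializers.Zeros()
w = init(shape=(2, 3))  # arbitrary example shape
print(w.numpy())        # all entries are 0.0
```

In practice, zeros is rarely used for kernel weights (all units would learn identical gradients) but is the standard default for biases.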
Task 2: Fill in the blank (medium)

Complete the code to initialize weights using He normal initialization in TensorFlow.

TensorFlow
initializer = tf.keras.initializers.[1]()
A. RandomUniform
B. GlorotUniform
C. Zeros
D. HeNormal
Common Mistakes
Choosing GlorotUniform instead of HeNormal may reduce performance for ReLU layers.
RandomUniform does not scale weights based on layer size.
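A minimal sketch of He normal initialization; the 256×256 shape is arbitrary, chosen large enough to check the sample standard deviation:

```python
import tensorflow as tf

# HeNormal draws from a (truncated) normal with stddev ~ sqrt(2 / fan_in),
# which keeps activation variance stable through ReLU layers.
init = tf.keras.initializers.HeNormal(seed=0)
w = init(shape=(256, 256))  # fan_in = 256
print(float(tf.math.reduce_std(w)))  # close to sqrt(2 / 256) ~ 0.088
```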
Task 3: Fill in the blank (hard)

Fix the error in the code to use Xavier (Glorot) uniform initialization for a Dense layer.

TensorFlow
layer = tf.keras.layers.Dense(64, kernel_initializer=[1])
A. 'glorot_uniform'
B. 'random_normal'
C. 'he_normal'
D. 'zeros'
Common Mistakes
Using 'he_normal' instead of 'glorot_uniform' switches from Xavier scaling (variance 2 / (fan_in + fan_out)) to He scaling (variance 2 / fan_in).
Using 'zeros' will initialize all weights to zero, which is not Xavier initialization.
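A minimal sketch of Xavier (Glorot) uniform on a Dense layer; the input width of 32 is an arbitrary assumption for illustration:

```python
import tensorflow as tf

# Glorot uniform samples from [-limit, limit],
# where limit = sqrt(6 / (fan_in + fan_out)).
layer = tf.keras.layers.Dense(64, kernel_initializer='glorot_uniform')
layer.build((None, 32))          # fan_in = 32, fan_out = 64
limit = (6 / (32 + 64)) ** 0.5   # = 0.25 here
print(float(tf.reduce_max(tf.abs(layer.kernel))))  # never exceeds 0.25
```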
Task 4: Fill in the blank (hard)

Fill both blanks to create a Dense layer with He uniform initializer and ReLU activation.

TensorFlow
layer = tf.keras.layers.Dense(128, kernel_initializer=[1], activation=[2])
A. 'he_uniform'
B. 'relu'
C. 'sigmoid'
D. 'glorot_normal'
Common Mistakes
Using 'sigmoid' activation with He uniform initializer is less common and may reduce performance.
Using 'glorot_normal' initializer with ReLU activation is valid but not the best match here.
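A minimal sketch of the He-uniform-plus-ReLU pairing; the batch size and input width are arbitrary:

```python
import tensorflow as tf

# he_uniform pairs naturally with ReLU: He scaling compensates for the
# roughly half of the units that ReLU zeroes out.
layer = tf.keras.layers.Dense(128, kernel_initializer='he_uniform',
                              activation='relu')
x = tf.random.normal((4, 16))  # arbitrary batch of inputs
y = layer(x)
print(y.shape)  # (4, 128); all values >= 0 because of ReLU
```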
Task 5: Fill in the blank (hard)

Fill all three blanks to define a Sequential model with two Dense layers using Xavier initialization and softmax activation.

TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, kernel_initializer=[1], activation=[2]),
    tf.keras.layers.Dense(10, kernel_initializer=[3], activation='softmax')
])
A. 'glorot_uniform'
B. 'relu'
C. 'glorot_normal'
D. 'he_uniform'
Common Mistakes
Using 'he_uniform' in the output layer is less common, since He scaling is tuned for ReLU rather than softmax outputs.
Using 'relu' activation in output layer instead of 'softmax' is incorrect for classification.
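A minimal sketch of the full Sequential model; the 20-feature input is an arbitrary assumption:

```python
import tensorflow as tf

# Two Dense layers with Xavier (glorot_uniform) kernels; softmax on the
# output layer makes each row of the result a probability distribution.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, kernel_initializer='glorot_uniform',
                          activation='relu'),
    tf.keras.layers.Dense(10, kernel_initializer='glorot_uniform',
                          activation='softmax'),
])
probs = model(tf.random.normal((2, 20)))  # arbitrary batch of 2 inputs
print(probs.shape)  # (2, 10); each row sums to 1
```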