Practice - 5 Tasks
Answer the questions below
1. Fill in the blank (easy): Complete the code to initialize weights with zeros in TensorFlow.

TensorFlow:
initializer = tf.keras.initializers.[1]()
Common Mistakes:
- Using RandomNormal instead of Zeros will initialize weights randomly, not to zero.
- GlorotUniform and HeNormal are advanced initializers; they do not produce zeros.
Explanation: The Zeros initializer sets all weights to zero, which is the simplest weight initialization strategy.
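As a rough illustration (plain NumPy rather than the Keras API; the helper name `zeros_init` is made up for this sketch), a zeros initializer simply fills the weight matrix with 0.0:

```python
import numpy as np

# Hypothetical helper mimicking what tf.keras.initializers.Zeros produces.
def zeros_init(shape):
    return np.zeros(shape)

W = zeros_init((4, 3))  # a 4-input, 3-unit weight matrix, all zeros
```

Note that all-zero weights make every unit in a layer compute the same output and receive identical gradients; this symmetry is why zeros are rarely used for hidden-layer weights in practice.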
2. Fill in the blank (medium): Complete the code to initialize weights using He normal initialization in TensorFlow.

TensorFlow:
initializer = tf.keras.initializers.[1]()
Common Mistakes:
- Choosing GlorotUniform instead of HeNormal may reduce performance for ReLU layers.
- RandomUniform does not scale weights based on layer size.
Explanation: The HeNormal initializer is designed for layers with ReLU activation; it draws weights from a normal distribution scaled by the number of input units.
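The scaling rule can be sketched in plain NumPy, assuming the usual He formula, stddev = sqrt(2 / fan_in) (the Keras implementation additionally truncates the normal distribution):

```python
import numpy as np

# He normal scale: stddev = sqrt(2 / fan_in), where fan_in is the
# number of input units feeding the layer.
def he_normal_std(fan_in):
    return np.sqrt(2.0 / fan_in)

rng = np.random.default_rng(0)
fan_in, units = 512, 256
W = rng.normal(0.0, he_normal_std(fan_in), size=(fan_in, units))
```

The larger the fan-in, the smaller the initial weights, which keeps the variance of ReLU pre-activations roughly constant from layer to layer.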
3. Fill in the blank (hard): Fix the error in the code to use Xavier (Glorot) uniform initialization for a Dense layer.

TensorFlow:
layer = tf.keras.layers.Dense(64, kernel_initializer=[1])
Common Mistakes:
- Using 'he_normal' instead of 'glorot_uniform' changes the initialization strategy.
- Using 'zeros' will initialize all weights to zero, which is not Xavier initialization.
Explanation: The correct string identifier for Xavier initialization in TensorFlow is 'glorot_uniform'.
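For intuition, the Glorot (Xavier) uniform rule can be sketched in NumPy; the bound below, limit = sqrt(6 / (fan_in + fan_out)), is the standard Glorot uniform formula:

```python
import numpy as np

# Glorot/Xavier uniform: draw from U(-limit, limit) with
# limit = sqrt(6 / (fan_in + fan_out)).
def glorot_uniform_limit(fan_in, fan_out):
    return np.sqrt(6.0 / (fan_in + fan_out))

rng = np.random.default_rng(0)
fan_in, fan_out = 128, 64
limit = glorot_uniform_limit(fan_in, fan_out)
W = rng.uniform(-limit, limit, size=(fan_in, fan_out))
```

Because the bound depends on both fan-in and fan-out, Xavier initialization balances the variance of the forward signal and the backward gradient.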
4. Fill in the blanks (hard): Fill both blanks to create a Dense layer with the He uniform initializer and ReLU activation.

TensorFlow:
layer = tf.keras.layers.Dense(128, kernel_initializer=[1], activation=[2])
Common Mistakes:
- Using 'sigmoid' activation with the He uniform initializer is less common and may reduce performance.
- Using the 'glorot_normal' initializer with ReLU activation is valid but not the best match here.
Explanation: The He uniform initializer is well suited to ReLU activation functions, which is why the two are paired here.
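A small NumPy sketch of the pairing, assuming the He uniform bound limit = sqrt(6 / fan_in) (the helper names here are made up for illustration):

```python
import numpy as np

# He uniform: U(-limit, limit) with limit = sqrt(6 / fan_in).
def he_uniform(rng, fan_in, fan_out):
    limit = np.sqrt(6.0 / fan_in)
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def relu(x):
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)
W = he_uniform(rng, 32, 128)
x = rng.normal(size=(1, 32))   # a single input example
h = relu(x @ W)                # ReLU zeroes the negative half
```

The extra factor in the He bound (compared with Xavier) compensates for ReLU discarding roughly half of the activations.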
5. Fill in the blanks (hard): Fill all three blanks to define a Sequential model with two Dense layers using Xavier initialization and softmax activation.

TensorFlow:
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, kernel_initializer=[1], activation=[2]),
    tf.keras.layers.Dense(10, kernel_initializer=[3], activation='softmax')
])
Common Mistakes:
- Using the 'he_uniform' initializer in the output layer is less common for softmax outputs.
- Using 'relu' activation in the output layer instead of 'softmax' is incorrect for classification.
Explanation: The first Dense layer uses the 'glorot_uniform' initializer with 'relu' activation. The second Dense layer uses the 'glorot_normal' initializer with 'softmax' activation for classification.
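To see the completed model end to end, here is a NumPy sketch of its forward pass (biases omitted for brevity; the initializer and activation helpers are hand-rolled stand-ins for the Keras ones):

```python
import numpy as np

def glorot_uniform(rng, fan_in, fan_out):
    # Xavier bound: sqrt(6 / (fan_in + fan_out))
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))  # numerically stable
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
W1 = glorot_uniform(rng, 20, 64)   # Dense(64), Xavier init
W2 = glorot_uniform(rng, 64, 10)   # Dense(10), Xavier init

x = rng.normal(size=(5, 20))       # batch of 5 examples
h = np.maximum(0.0, x @ W1)        # relu hidden layer
probs = softmax(h @ W2)            # each output row sums to 1
```

Softmax in the output layer turns the 10 logits into a probability distribution over classes, which is what a classification model needs.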