TensorFlow · ~10 mins

Dense (fully connected) layers in TensorFlow - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: fill in the blank (easy)

Complete the code to create a dense layer with 10 units.

TensorFlow
layer = tf.keras.layers.Dense([1])
A. 0
B. None
C. 10
D. 5
Common Mistakes
Using 0 or None will cause an error or create a layer with no neurons.
Choosing a smaller number such as 5 does not meet the requirement of 10 units.
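With the blank filled in as 10, the completed code might look like the sketch below; the dummy batch of 2 samples with 4 features is an illustrative assumption, not part of the task.

```python
import tensorflow as tf

# A dense (fully connected) layer with 10 output units (the correct blank: 10)
layer = tf.keras.layers.Dense(10)

# Calling the layer on a dummy batch builds its weights; the input
# feature count (4 here) is an arbitrary choice for illustration
out = layer(tf.zeros((2, 4)))
print(out.shape)  # (2, 10): batch of 2 samples, 10 units each
```

Note that the layer infers its input size lazily from the first batch it sees; only the number of output units must be specified up front.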
Task 2: fill in the blank (medium)

Complete the code to add a ReLU activation to the dense layer.

TensorFlow
layer = tf.keras.layers.Dense(10, activation=[1])
A. 'relu'
B. 'sigmoid'
C. 'softmax'
D. 'tanh'
Common Mistakes
'sigmoid' and 'softmax' behave differently and are usually reserved for output layers, not hidden layers.
Forgetting the quotes around the activation name causes a NameError.
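With 'relu' in the blank, the completed layer might be exercised as in this sketch; the random dummy input is an illustrative assumption.

```python
import tensorflow as tf

# Dense layer with a ReLU activation (the correct blank: 'relu')
layer = tf.keras.layers.Dense(10, activation='relu')

# ReLU zeroes out negative pre-activations, so every output is non-negative
out = layer(tf.random.normal((2, 4)))
```

Passing the string 'relu' (with quotes) lets Keras look up the built-in activation; a bare relu would be treated as an undefined Python name.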
Task 3: fill in the blank (hard)

Fix the error in the code by completing the missing argument for input shape.

TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu', input_shape=[1])
])
A. (10,)
B. {10}
C. [10]
D. 10
Common Mistakes
Using a bare integer causes an error because the shape must be a sequence, not a scalar.
Set brackets ({10}) are invalid for input_shape; the conventional form is a tuple such as (10,).
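With the tuple (10,) in the blank, the fixed model might look like this sketch; the dummy input batch is an illustrative assumption.

```python
import tensorflow as tf

# input_shape must be a sequence; the conventional form is a
# tuple such as (10,), meaning 10 features per sample
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu', input_shape=(10,))
])

# A dummy batch of 1 sample with 10 features, chosen for illustration
out = model(tf.zeros((1, 10)))
```

The trailing comma in (10,) is what makes it a one-element tuple; (10) is just the integer 10 in parentheses and would fail the same way a bare integer does.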
Task 4: fill in the blank (hard)

Fill both blanks to create a model with two dense layers: first with 64 units and ReLU activation, second with 10 units and softmax activation.

TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.Dense([1], activation='relu'),
    tf.keras.layers.Dense([2], activation='softmax')
])
A. 64
B. 10
C. 32
D. 5
Common Mistakes
Swapping the number of units between the two layers.
Using numbers such as 32 or 5, which do not match the task.
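With 64 and 10 in the two blanks, the completed model might behave as sketched below; the dummy batch of 2 samples with 8 features is an illustrative assumption.

```python
import tensorflow as tf

# First blank: 64 units (hidden layer); second blank: 10 units (output layer)
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])

# Softmax normalizes each row of the output into a probability
# distribution, so every row sums to 1
out = model(tf.zeros((2, 8)))
```

The order matters: the wide ReLU layer comes first, and the 10-unit softmax layer last, matching a typical 10-class classification head.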
Task 5: fill in the blank (hard)

Fill all three blanks to create a dense layer with 128 units, 'tanh' activation, and use it as the first layer with input shape (20,).

TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.Dense([1], activation=[2], input_shape=[3])
])
A. 128
B. 'tanh'
C. (20,)
D. 'relu'
Common Mistakes
Using the wrong activation, such as 'relu' instead of 'tanh'.
Not passing a tuple for input_shape.
Using the wrong number of units.
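With all three blanks filled (128, 'tanh', (20,)), the completed model might look like this sketch; the dummy batch of 3 samples is an illustrative assumption.

```python
import tensorflow as tf

# Blanks in order: 128 units, 'tanh' activation, input_shape=(20,)
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation='tanh', input_shape=(20,))
])

# tanh squashes every output into the range [-1, 1]
out = model(tf.zeros((3, 20)))
```

Because input_shape=(20,) is given, the model is built immediately with weights of shape (20, 128), rather than waiting for the first batch.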