Complete the code to create a dense layer with 10 units.
layer = tf.keras.layers.Dense(10)
The number 10 specifies the number of neurons in the dense layer.
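To see why the unit count fixes the output size, here is a minimal pure-Python sketch of what a dense (fully connected) layer computes, not the Keras implementation: each unit produces one output via a dot product plus a bias, so 10 units yield a 10-dimensional output. The weight and bias values below are arbitrary placeholders.

```python
# Sketch of a dense layer's forward pass: one output per unit,
# computed as dot(x, w_unit) + b_unit.
def dense(x, weights, biases):
    """weights: one weight vector per unit; biases: one bias per unit."""
    return [sum(xi * wi for xi, wi in zip(x, w)) + b
            for w, b in zip(weights, biases)]

# 3 input features, 10 units -> a 10-dimensional output
x = [1.0, 2.0, 3.0]
weights = [[0.1] * 3 for _ in range(10)]  # 10 weight vectors of length 3
biases = [0.0] * 10
y = dense(x, weights, biases)
print(len(y))  # 10
```

In Keras, `Dense(10)` creates and trains these weights for you; the sketch only shows the shape arithmetic.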
Complete the code to add a ReLU activation to the dense layer.
layer = tf.keras.layers.Dense(10, activation='relu')
The 'relu' activation function is commonly used for hidden layers to add non-linearity.
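ReLU itself is a simple elementwise function, relu(x) = max(0, x): it passes positive values through unchanged and zeroes out negatives, which is what introduces the non-linearity. A minimal sketch (not the Keras implementation):

```python
# ReLU: keep positive values, zero out negatives.
def relu(values):
    return [max(0.0, v) for v in values]

print(relu([-2.0, -0.5, 0.0, 1.5, 3.0]))  # [0.0, 0.0, 0.0, 1.5, 3.0]
```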
Fix the error in the code by completing the missing argument for input shape.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu', input_shape=(1,))
])
The input_shape argument expects a tuple describing the shape of each input sample.
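A point worth stressing: input_shape describes one sample and excludes the batch dimension, which Keras handles automatically. A small sketch of the shape bookkeeping (the batch size of 4 is an arbitrary example):

```python
# input_shape=(1,) means each sample has 1 feature.
# A batch of 4 such samples therefore has shape (4, 1).
batch = [[0.0], [1.0], [2.0], [3.0]]  # 4 samples, 1 feature each
batch_shape = (len(batch), len(batch[0]))
print(batch_shape)  # (4, 1)
```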
Fill both blanks to create a model with two dense layers: first with 64 units and ReLU activation, second with 10 units and softmax activation.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])
The first dense layer has 64 units with ReLU activation, and the second has 10 units with softmax for classification.
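Softmax is what makes the 10-unit output usable for classification: it converts the raw scores into positive values that sum to 1, one probability per class. A minimal sketch (not the Keras implementation; the logits below are arbitrary examples):

```python
import math

# Softmax: exponentiate each score, then normalize so the results sum to 1.
def softmax(logits):
    m = max(logits)                          # subtract max for numerical stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print(sum(probs))  # ~1.0
```

Larger scores map to larger probabilities, so the predicted class is simply the unit with the highest output.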
Fill all three blanks to create a dense layer with 128 units, 'tanh' activation, and use it as the first layer with input shape (20,).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation='tanh', input_shape=(20,))
])
The dense layer has 128 units, uses 'tanh' activation, and expects input samples shaped (20,).
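Unlike ReLU, tanh squashes each unit's pre-activation into the open interval (-1, 1), which keeps activations bounded and centered around zero. A minimal sketch using the standard library (the input values are arbitrary examples):

```python
import math

# tanh maps any real input into (-1, 1), with tanh(0) == 0.
values = [-5.0, 0.0, 5.0]
activations = [math.tanh(v) for v in values]
print(activations)  # roughly [-0.9999, 0.0, 0.9999]
```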