Complete the code to add a Flatten layer to the model.
model = tf.keras.Sequential([
[1](input_shape=(28, 28, 1))
])

The Flatten layer converts the 2D image input into a 1D vector, which is needed before feeding it into Dense layers.
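As a sketch of one possible completion (assuming the blank is tf.keras.layers.Flatten), the flattening can be verified by checking the model's output shape:

```python
import tensorflow as tf

# Flatten turns each (28, 28, 1) image into a single 784-element vector.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28, 1))
])

print(model.output_shape)  # (None, 784), since 28 * 28 * 1 = 784
```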
Complete the code to add a Dense layer with 128 units and ReLU activation.
model.add([1](128, activation='relu'))
Dense creates a fully connected layer with 128 neurons and ReLU activation.
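Under the same assumption that the blank is tf.keras.layers.Dense, a sketch of the completed step, with the layer's output shape checked afterwards:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28, 1))
])
# Fully connected layer: 128 neurons, each applying ReLU to a weighted sum
# of all 784 flattened inputs.
model.add(tf.keras.layers.Dense(128, activation='relu'))

print(model.output_shape)  # (None, 128)
```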
Fix the error in the code to correctly add a Dense output layer with 10 units and softmax activation.
model.add(Dense([1], activation='softmax'))
The output layer should have 10 units (one per class); the units argument must be an integer, not a string or other value.
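One corrected version, as a sketch (assuming Dense has been imported from tensorflow.keras.layers and the model already ends in a hidden Dense layer):

```python
import tensorflow as tf
from tensorflow.keras.layers import Dense

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28, 1)),
    tf.keras.layers.Dense(128, activation='relu'),
])
# units must be the integer 10, i.e. Dense(10, ...), not Dense('10', ...).
model.add(Dense(10, activation='softmax'))

print(model.output_shape)  # (None, 10)
```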
Fill both blanks to create a model with a Flatten layer followed by a Dense layer with 64 units.
model = tf.keras.Sequential([
[1](input_shape=(32, 32, 3)),
[2](64, activation='relu')
])

The Flatten layer converts the input to 1D, then the Dense layer creates 64 neurons with ReLU activation.
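A sketch of one possible completion, assuming blank [1] is tf.keras.layers.Flatten and blank [2] is tf.keras.layers.Dense:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(32, 32, 3)),  # 32 * 32 * 3 = 3072 features
    tf.keras.layers.Dense(64, activation='relu')       # 64-neuron hidden layer
])

print(model.output_shape)  # (None, 64)
```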
Fill all three blanks to build a model with Flatten, Dense 128 ReLU, and Dense 10 softmax layers.
model = tf.keras.Sequential([
[1](input_shape=(28, 28, 1)),
[2](128, activation='relu'),
[3](10, activation='softmax')
])

The model first flattens the input, then applies a Dense layer with 128 units and ReLU, and finally a Dense output layer with 10 units and softmax.
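Putting the three pieces together, a sketch of the fully completed model (assuming the blanks are tf.keras.layers.Flatten and tf.keras.layers.Dense); model.summary() shows the three layers and their parameter counts:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28, 1)),  # (28, 28, 1) -> 784
    tf.keras.layers.Dense(128, activation='relu'),     # hidden layer
    tf.keras.layers.Dense(10, activation='softmax')    # 10 class probabilities
])

model.summary()
```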