Complete the code to create a simple sequential model with one dense layer.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense([1], input_shape=(10,), activation='relu'))
The first Dense layer needs the number of units as an integer. Here, 32 units is a common choice.
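Filling blank [1] with 32 units, as the explanation suggests, gives a runnable version of the completed exercise:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Blank [1] filled with 32 units (the value suggested above)
model = Sequential()
model.add(Dense(32, input_shape=(10,), activation='relu'))
```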
Complete the code to compile the model with mean squared error loss.
model.compile(optimizer='adam', loss='[1]', metrics=['accuracy'])
For regression tasks, mean squared error ('mse') is a common loss function. Note that accuracy is only meaningful for classification; for regression, a metric such as mean absolute error ('mae') is more informative.
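With blank [1] filled with 'mse', the compile step runs as below (reusing the one-layer model from the first exercise so the snippet is self-contained):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense(32, input_shape=(10,), activation='relu'))

# Blank [1] filled with 'mse' (mean squared error)
model.compile(optimizer='adam', loss='mse', metrics=['accuracy'])
```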
Fix the error in the code to add a dropout layer after the first dense layer.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

model = Sequential()
model.add(Dense(64, input_shape=(20,), activation='relu'))
model.add([1](0.5))
A dropout layer is added by calling Dropout with the dropout rate as its argument.
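With blank [1] filled with Dropout, the corrected code runs as:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

model = Sequential()
model.add(Dense(64, input_shape=(20,), activation='relu'))
# Blank [1] filled: Dropout, called with a rate of 0.5
model.add(Dropout(0.5))
```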
Fill all three blanks to create a sequential model with two dense layers and compile it with Adam optimizer and categorical crossentropy loss.
model = Sequential()
model.add(Dense([1], activation='relu', input_shape=(15,)))
model.add(Dense([2], activation='softmax'))
model.compile(optimizer='[3]', loss='categorical_crossentropy')
The first dense layer has 64 units with relu activation. The second dense layer has 10 units, one per class, with softmax activation. The Adam optimizer is used when compiling.
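Substituting the values from the explanation (blank [1] = 64, blank [2] = 10, blank [3] = 'adam'), the completed exercise runs as:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense(64, activation='relu', input_shape=(15,)))  # blank [1] = 64
model.add(Dense(10, activation='softmax'))                  # blank [2] = 10
model.compile(optimizer='adam',                             # blank [3] = 'adam'
              loss='categorical_crossentropy')
```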
Fill all three blanks to create a sequential model with input shape, two dense layers, and compile it with RMSprop optimizer and binary crossentropy loss.
model = Sequential()
model.add(Dense([1], activation='relu', input_shape=([2],)))
model.add(Dense([3], activation='sigmoid'))
model.compile(optimizer='rmsprop', loss='binary_crossentropy')
The first dense layer has 128 units with relu activation and input shape of 20 features. The output layer has 1 unit with sigmoid activation for binary classification.
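Substituting the values from the explanation (blank [1] = 128, blank [2] = 20, blank [3] = 1), the completed exercise runs as:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense(128, activation='relu', input_shape=(20,)))  # [1] = 128, [2] = 20
model.add(Dense(1, activation='sigmoid'))                    # [3] = 1
model.compile(optimizer='rmsprop', loss='binary_crossentropy')
```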