Debug Q7 of 15 (medium)
NLP - Sequence Models for NLP
Find the bug in this LSTM model training code:
from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(128, input_shape=(20, 30)))
model.add(Dense(5, activation='softmax'))
model.compile(loss='mse', optimizer='adam')
model.fit(X_train, y_train, epochs=10)
A. Missing batch size in input_shape
B. Loss function 'mse' is inappropriate for classification
C. Softmax activation should be replaced with sigmoid
D. LSTM units must match output classes
Step-by-Step Solution
  1. Step 1: Identify task type

    Output layer with 5 units and softmax suggests multi-class classification.
  2. Step 2: Check loss function suitability

    Mean squared error ('mse') is a regression loss; for multi-class classification with one-hot labels the model should use 'categorical_crossentropy' (or 'sparse_categorical_crossentropy' for integer labels).
  3. Final Answer:

    Loss function 'mse' is inappropriate for classification -> Option B
  4. Quick Check:

    Classification tasks need a cross-entropy loss.
Quick Trick: Use a cross-entropy loss for classification tasks.
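To see why the loss choice matters, here is a minimal pure-Python sketch (the probability values are hypothetical, chosen only for illustration): for a confidently wrong prediction, MSE reports a small error while categorical cross-entropy reports a large one, so cross-entropy gives the optimizer a much stronger training signal.

```python
import math

# Hypothetical 5-class example: true class is index 0,
# so the one-hot label is [1, 0, 0, 0, 0].
y_true = [1.0, 0.0, 0.0, 0.0, 0.0]

# Confidently wrong softmax output: almost all mass on class 2.
y_pred = [0.01, 0.01, 0.96, 0.01, 0.01]

# MSE treats the probabilities as regression targets -> small error.
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Categorical cross-entropy penalises the low probability
# assigned to the true class -> large error.
cce = -sum(t * math.log(p) for t, p in zip(y_true, y_pred) if t > 0)

print(f"mse = {mse:.3f}")  # ~0.380
print(f"cce = {cce:.3f}")  # ~4.605

# The corresponding fix in the quiz code is the compile line:
# model.compile(loss='categorical_crossentropy', optimizer='adam')
```

Note the cross-entropy value here is roughly twelve times the MSE value for the same wrong prediction, which is exactly the sharper gradient signal a classifier needs.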
Common Mistakes:
  • Using mse loss for classification
  • Confusing batch size with input_shape
  • Replacing softmax with sigmoid when the classes are mutually exclusive
