📝 Debug Q6 of 15 (Medium)
NLP - Sequence Models for NLP
Examine this code snippet:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Bidirectional, LSTM

model = Sequential()
model.add(Bidirectional(LSTM(32, input_shape=(10, 8))))
model.add(LSTM(16))
model.compile(optimizer='adam', loss='mse')

What is the likely cause of an error during training?
A. The optimizer 'adam' is not compatible with LSTM layers.
B. The Bidirectional wrapper cannot be used with the input_shape argument.
C. The first (Bidirectional) LSTM lacks return_sequences=True, causing a shape mismatch at the second LSTM.
D. The loss function 'mse' is invalid for regression tasks.
Step-by-Step Solution
Solution:
  1. Step 1: Analyze layer outputs

By default, an LSTM (including one wrapped in Bidirectional) has return_sequences=False, so it returns only its final output: a 2D tensor of shape (batch, 64), not a sequence.
  2. Step 2: Check second LSTM input

The second LSTM expects 3D input of shape (batch, timesteps, features); it only receives that if the previous layer sets return_sequences=True. Otherwise a shape mismatch occurs.
  3. Step 3: Identify error

Since the first layer returns only its last output (2D), the second LSTM's 3D input requirement fails; the fix is to add return_sequences=True to the first (Bidirectional) LSTM.
  4. Final Answer:

The first (Bidirectional) LSTM lacks return_sequences=True, causing a shape mismatch at the second LSTM. -> Option C
  5. Quick Check:

    Check return_sequences for stacked LSTMs [OK]
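A corrected version of the snippet, as a sketch (the input_shape argument is moved to the Bidirectional wrapper here; the layer sizes match the question):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Bidirectional, LSTM

model = Sequential()
# return_sequences=True makes the first layer emit the full sequence,
# shape (batch, 10, 64): 10 timesteps, 32 units per direction x 2.
model.add(Bidirectional(LSTM(32, return_sequences=True), input_shape=(10, 8)))
# The last LSTM keeps the default return_sequences=False -> (batch, 16)
model.add(LSTM(16))
model.compile(optimizer='adam', loss='mse')
```

With this change the model builds cleanly and training on (batch, 10, 8) inputs proceeds without a shape error.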
Quick Trick: In stacked LSTMs, every layer except the last needs return_sequences=True [OK]
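The trick can be traced without running Keras. The helper below is a pure-Python illustration (not a Keras API) of how the layer output shapes chain together:

```python
def lstm_output_shape(input_shape, units, return_sequences=False, bidirectional=False):
    """Mimic Keras LSTM output-shape inference (illustration only)."""
    batch, timesteps, _features = input_shape
    # Bidirectional concatenates forward and backward outputs
    out_units = units * 2 if bidirectional else units
    return (batch, timesteps, out_units) if return_sequences else (batch, out_units)

# Buggy model: the first layer keeps the default return_sequences=False,
# so its output is 2D, and the next LSTM (which needs 3D input) fails.
buggy = lstm_output_shape((None, 10, 8), 32, bidirectional=True)
print(buggy)  # (None, 64) -- only 2 dimensions

# Fixed: return_sequences=True preserves the time axis.
fixed = lstm_output_shape((None, 10, 8), 32, return_sequences=True, bidirectional=True)
print(fixed)  # (None, 10, 64) -- valid 3D input for the next LSTM
```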
Common Mistakes:
  • Assuming input_shape cannot be passed to a Bidirectional wrapper (it can)
  • Blaming the optimizer or loss function, both of which are valid here
  • Attaching return_sequences=True to the last LSTM instead of the earlier ones
