Medium · Debug · Q7 of 15
NLP - Sequence Models for NLP

Find the bug in this Bidirectional LSTM code snippet:
from tensorflow.keras.layers import Bidirectional, LSTM, Dense, Input
from tensorflow.keras.models import Model

inputs = Input(shape=(5, 10))
lstm = Bidirectional(LSTM(20, return_sequences=True))(inputs)
dense = Dense(1)(lstm)
model = Model(inputs, dense)
model.compile('adam', 'binary_crossentropy')
A. Input shape is incorrect for LSTM
B. Dense layer applied to 3D tensor without flattening
C. Loss function incompatible with output shape
D. Bidirectional wrapper missing merge_mode argument
Step-by-Step Solution:
  1. Step 1: Check output shape of Bidirectional LSTM

    With return_sequences=True and units=20, output shape is (batch, 5, 40).
  2. Step 2: Analyze Dense layer input

    Dense applied to a 3D tensor acts on the last axis, yielding (batch, 5, 1): one prediction per timestep. A single binary label per sequence has shape (batch, 1), which does not match the model's output, so training fails with a shape mismatch.
  3. Final Answer:

    Dense layer applied to 3D tensor without flattening -> Option B
  4. Quick Check:

    Dense needs 2D input; flatten or use TimeDistributed ✓
Quick Trick: Dense layers need 2D input; flatten 3D outputs first ✓
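The shape reasoning in Steps 1 and 2 can be verified directly; this is a minimal sketch using the sizes from the snippet above:

```python
from tensorflow.keras.layers import Bidirectional, LSTM, Input
from tensorflow.keras.models import Model

inputs = Input(shape=(5, 10))
# Bidirectional concatenates forward and backward outputs by default:
# 2 * 20 units = 40 features per timestep.
x = Bidirectional(LSTM(20, return_sequences=True))(inputs)
model = Model(inputs, x)
print(model.output_shape)  # (None, 5, 40)
```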
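One minimal fix, assuming the task is one binary label per sequence, is to drop return_sequences so Dense receives a 2D tensor. The sigmoid activation here is an addition not in the original snippet, added so the output pairs cleanly with binary_crossentropy:

```python
from tensorflow.keras.layers import Bidirectional, LSTM, Dense, Input
from tensorflow.keras.models import Model

inputs = Input(shape=(5, 10))
# return_sequences=False -> only the final timestep: shape (batch, 40).
lstm = Bidirectional(LSTM(20, return_sequences=False))(inputs)
dense = Dense(1, activation='sigmoid')(lstm)  # sigmoid for binary_crossentropy
model = Model(inputs, dense)
model.compile('adam', 'binary_crossentropy')
print(model.output_shape)  # (None, 1)
```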
Common Mistakes:
  • Ignoring tensor shape mismatch
  • Assuming input shape is wrong
  • Thinking merge_mode is mandatory
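If per-timestep predictions are actually what you want (e.g. sequence labeling), TimeDistributed keeps the sequence axis instead of flattening it; this sketch is an alternative, not the quiz's answer:

```python
from tensorflow.keras.layers import (Bidirectional, Dense, Input, LSTM,
                                     TimeDistributed)
from tensorflow.keras.models import Model

inputs = Input(shape=(5, 10))
x = Bidirectional(LSTM(20, return_sequences=True))(inputs)
# TimeDistributed applies Dense independently at each of the 5 timesteps.
outputs = TimeDistributed(Dense(1, activation='sigmoid'))(x)
model = Model(inputs, outputs)
model.compile('adam', 'binary_crossentropy')  # targets must be (batch, 5, 1)
print(model.output_shape)  # (None, 5, 1)
```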
