
Bidirectional LSTM in NLP - Practice Problems & Coding Challenges

Challenge - 5 Problems
Problem 1: Predict Output (intermediate)
Output shape of Bidirectional LSTM layer
Consider the following Keras code snippet that creates a Bidirectional LSTM layer. What is the shape of the output tensor after passing an input batch of shape (32, 10, 8) through this layer?

from tensorflow.keras.layers import Bidirectional, LSTM
from tensorflow.keras.models import Sequential

model = Sequential()
model.add(Bidirectional(LSTM(16, return_sequences=False), input_shape=(10, 8)))

output = model.layers[0].output_shape
A. (None, 16)
B. (None, 32)
C. (None, 64)
D. (None, 10, 32)
💡 Hint
Remember that Bidirectional (with the default merge_mode='concat') doubles the size of the last output dimension of the wrapped LSTM.
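The doubling in the hint can be checked without running Keras. Below is a shapes-only NumPy sketch of what Bidirectional does with return_sequences=False; the toy_lstm_last_state helper is a zero-valued stand-in for the real recurrence (not a Keras API), chosen only to make the shapes visible:

```python
import numpy as np

batch, timesteps, features, units = 32, 10, 8, 16

# Stand-in for one LSTM direction: with return_sequences=False it
# consumes (batch, timesteps, features) and emits only the last
# hidden state, shape (batch, units). Zeros replace the real math.
def toy_lstm_last_state(x, units):
    return np.zeros((x.shape[0], units))

x = np.random.rand(batch, timesteps, features)
h_forward = toy_lstm_last_state(x, units)            # (32, 16)
h_backward = toy_lstm_last_state(x[:, ::-1], units)  # (32, 16), time reversed

# Bidirectional's default merge_mode='concat' joins the two directions
# along the last axis, doubling the units.
output = np.concatenate([h_forward, h_backward], axis=-1)
print(output.shape)  # (32, 32) -> reported per-layer as (None, 32)
```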
Problem 2: Model Choice (intermediate)
Choosing Bidirectional LSTM for sequence classification
You want to build a model to classify movie reviews as positive or negative based on the text. Which model architecture below best uses a Bidirectional LSTM for this task?
A. Embedding -> Dense(64, activation='relu') -> Bidirectional LSTM (units=64) -> Dense(1, activation='sigmoid')
B. Embedding -> LSTM (units=64, return_sequences=True) -> Dense(1, activation='sigmoid')
C. Embedding -> Bidirectional LSTM (units=64, return_sequences=True) -> Dense(1, activation='sigmoid')
D. Embedding -> Bidirectional LSTM (units=64, return_sequences=False) -> Dense(1, activation='sigmoid')
💡 Hint
For classification, the LSTM should output a single vector per sample, not a sequence.
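To see why a single vector per sample is what the final Dense(1, sigmoid) needs, here is a shapes-only NumPy sketch of an Embedding -> Bidirectional LSTM (return_sequences=False) -> Dense(1, sigmoid) pipeline. The zero review_vector stands in for the real recurrent computation, and all array names are illustrative:

```python
import numpy as np

batch, timesteps, vocab, embed_dim, units = 4, 10, 1000, 64, 64

# Toy embedding lookup: token ids -> vectors, (batch, timesteps, embed_dim).
tokens = np.random.randint(0, vocab, size=(batch, timesteps))
table = np.random.rand(vocab, embed_dim)
embedded = table[tokens]

# Bidirectional LSTM with return_sequences=False -> one (batch, 2*units)
# vector per review (zeros stand in for the real hidden states).
review_vector = np.zeros((batch, 2 * units))

# Dense(1, sigmoid) maps each review vector to a single probability.
w = np.random.rand(2 * units, 1)
prob = 1.0 / (1.0 + np.exp(-(review_vector @ w)))
print(prob.shape)  # (4, 1): one positive/negative score per review
```

With return_sequences=True instead, the Dense layer would receive a 3D tensor and produce one score per time step rather than one per review.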
Problem 3: Hyperparameter (advanced)
Effect of return_sequences in Bidirectional LSTM
What is the effect of setting return_sequences=True in a Bidirectional LSTM layer in Keras?
A. The layer returns the input sequence unchanged.
B. The layer returns the cell state instead of the hidden state.
C. The layer returns the hidden state for each time step, producing a 3D output tensor.
D. The layer returns only the last hidden state as a 2D output tensor.
💡 Hint
Think about whether the output keeps the time dimension or not.
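The hint can be made concrete with a shapes-only NumPy sketch, where zero tensors stand in for the real per-direction hidden states:

```python
import numpy as np

batch, timesteps, units = 4, 10, 16

# What each direction of the LSTM yields with return_sequences=True:
# a hidden state for every time step, shape (batch, timesteps, units).
h_forward = np.zeros((batch, timesteps, units))
h_backward = np.zeros((batch, timesteps, units))

# return_sequences=True: concatenate per time step -> 3D (batch, timesteps, 2*units)
seq_output = np.concatenate([h_forward, h_backward], axis=-1)

# return_sequences=False: only the final step survives -> 2D (batch, 2*units)
last_output = seq_output[:, -1, :]

print(seq_output.shape, last_output.shape)  # (4, 10, 32) (4, 32)
```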
Problem 4: Metrics (advanced)
Interpreting training metrics of Bidirectional LSTM
You train a Bidirectional LSTM model for sentiment analysis. After 10 epochs, training accuracy is 95% but validation accuracy is 60%. What does this indicate?
A. The model is overfitting the training data and not generalizing well.
B. The model is underfitting and needs more training epochs.
C. The model has a bug causing incorrect validation evaluation.
D. The validation data is easier than the training data.
💡 Hint
High training accuracy but low validation accuracy usually means the model memorizes training data.
Problem 5: 🔧 Debug (expert)
Debugging shape mismatch in Bidirectional LSTM model
You have this Keras model code snippet:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, LSTM, Dense

model = Sequential()
model.add(Embedding(input_dim=1000, output_dim=64, input_length=20))
model.add(Bidirectional(LSTM(32)))
model.add(Dense(10, activation='softmax'))

model.compile(optimizer='adam', loss='categorical_crossentropy')

# You try to train with labels shape (batch_size, 20, 10) but get a shape error.


What is the cause of the shape error?
A. The model outputs shape (batch_size, 10) but the labels have shape (batch_size, 20, 10), causing a mismatch.
B. The Embedding layer's output shape is incompatible with the LSTM's input shape.
C. The 'softmax' activation in the Dense layer is invalid for multi-class classification.
D. The 'categorical_crossentropy' loss requires integer labels, not one-hot encoded ones.
💡 Hint
Check the output shape of the Bidirectional LSTM and the shape of the labels.
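A shapes-only NumPy sketch of the mismatch the hint points at (the zero arrays are placeholders, not real model outputs or data):

```python
import numpy as np

batch = 8

# With return_sequences left at its default False, Bidirectional(LSTM(32))
# collapses the 20 time steps into one vector, so Dense(10, softmax)
# produces a single 10-way distribution per sample: (batch, 10).
model_output = np.zeros((batch, 10))

# The labels were prepared per time step instead: (batch, 20, 10).
labels = np.zeros((batch, 20, 10))

print(model_output.shape, labels.shape)  # (8, 10) vs (8, 20, 10)
# Fix: either supply labels of shape (batch, 10), or set
# return_sequences=True so the model emits per-time-step predictions.
```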