
Bidirectional RNN in TensorFlow - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual
intermediate
Understanding Bidirectional RNN Output Shape

Consider a bidirectional RNN layer in TensorFlow with units=32 and return_sequences=True. If the input shape is (batch_size, timesteps, features), what will be the output shape of this layer?

A. (batch_size, 64)
B. (batch_size, 32)
C. (batch_size, timesteps, 32)
D. (batch_size, timesteps, 64)
💡 Hint

Remember that a bidirectional RNN concatenates outputs from forward and backward passes.
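The concatenation behavior described in the hint can be verified directly in TensorFlow. Here is a minimal sketch using an arbitrary toy input (batch of 2, 7 timesteps, 16 features are illustrative values, not from the question):

```python
import numpy as np
import tensorflow as tf

# Toy batch: 2 sequences, 7 timesteps, 16 features (illustrative values).
x = tf.constant(np.random.random((2, 7, 16)), dtype=tf.float32)

# With return_sequences=True the layer emits one vector per timestep;
# the forward and backward outputs (32 units each) are concatenated,
# giving a last dimension of 64.
bi = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(32, return_sequences=True))
print(bi(x).shape)  # (2, 7, 64)
```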

Predict Output
intermediate
Output of Bidirectional LSTM Layer

What is the output of the following TensorFlow code snippet?

TensorFlow
import tensorflow as tf
import numpy as np

inputs = tf.constant(np.random.random((1, 5, 10)), dtype=tf.float32)
layer = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(4, return_sequences=False))
output = layer(inputs)
print(output.shape)
A. (1, 4)
B. (1, 8)
C. (5, 8)
D. (5, 4)
💡 Hint

Check the return_sequences parameter and how bidirectional layers concatenate outputs.
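After answering, the effect of `return_sequences` on a bidirectional layer can be explored with a small sketch. The sizes here (batch 3, 6 timesteps, 5 features, 7 units, GRU instead of LSTM) are deliberately different from the question so it illustrates the rule without restating the answer:

```python
import numpy as np
import tensorflow as tf

x = tf.constant(np.random.random((3, 6, 5)), dtype=tf.float32)

# return_sequences=False: one concatenated vector per sequence.
last_only = tf.keras.layers.Bidirectional(
    tf.keras.layers.GRU(7, return_sequences=False))
print(last_only(x).shape)  # (3, 14)

# return_sequences=True: one concatenated vector per timestep.
per_step = tf.keras.layers.Bidirectional(
    tf.keras.layers.GRU(7, return_sequences=True))
print(per_step(x).shape)  # (3, 6, 14)
```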

Model Choice
advanced
Choosing the Right Bidirectional RNN for Sequence Tagging

You want to build a model to tag each word in a sentence with its part of speech. Which bidirectional RNN layer configuration is best suited for this task?

A. Simple RNN with return_sequences=False
B. Bidirectional LSTM with return_sequences=False
C. Bidirectional GRU with return_sequences=True
D. Bidirectional LSTM with return_sequences=False and dropout=0.5
💡 Hint

Think about whether you need output for each timestep or just one output per sequence.
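A per-timestep tagger along the lines the hint suggests might be sketched as follows. The vocabulary size, tag count, and layer widths here are assumptions for illustration, not values from the question:

```python
import tensorflow as tf

VOCAB_SIZE = 1000   # assumed vocabulary size (illustrative)
NUM_TAGS = 17       # assumed tag-set size (illustrative)

# Tagging assigns a label to every word, so the recurrent layer must
# return the full sequence rather than only its final state.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 64),
    tf.keras.layers.Bidirectional(
        tf.keras.layers.GRU(32, return_sequences=True)),
    tf.keras.layers.Dense(NUM_TAGS, activation='softmax'),
])

# One softmax distribution over tags per token in each sentence.
print(model(tf.zeros((2, 9), dtype=tf.int32)).shape)  # (2, 9, 17)
```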

Hyperparameter
advanced
Effect of Increasing Units in Bidirectional RNN

What is the most likely effect of doubling the number of units in a bidirectional RNN layer on model training?

A. Training time increases and model capacity increases, possibly improving accuracy
B. Training time decreases and model capacity decreases, reducing accuracy
C. Training time stays the same but model capacity decreases
D. Training time increases but model capacity stays the same
💡 Hint

More units mean more parameters to learn.
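The hint can be quantified by counting parameters. Because the recurrent kernel is units x units, doubling the units more than doubles the parameter count. The feature size and unit counts below are arbitrary choices for illustration:

```python
import tensorflow as tf

def bi_lstm_params(units, features=8):
    """Parameter count of a lone bidirectional LSTM layer (hypothetical helper)."""
    inp = tf.keras.Input(shape=(None, features))
    out = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(units))(inp)
    return tf.keras.Model(inp, out).count_params()

small = bi_lstm_params(16)
large = bi_lstm_params(32)

# The recurrent weights grow quadratically with units, so the larger
# layer has well over twice as many parameters.
print(small, large)
assert large > 2 * small
```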

🔧 Debug
expert
Identifying the Cause of Shape Mismatch in Bidirectional RNN

Given the following code, what is the cause of the error?

import tensorflow as tf
inputs = tf.keras.Input(shape=(10, 8))
# Bidirectional LSTM with 16 units
x = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(16))(inputs)
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')

# Trying to train with input shape (32, 10, 8) and target shape (32, 10, 1)
x_train = tf.random.normal((32, 10, 8))
y_train = tf.random.normal((32, 10, 1))
model.fit(x_train, y_train, epochs=1)
A. The LSTM layer should have return_sequences=True to output a sequence for each timestep
B. The input shape is incorrect; it should be (10, 1) instead of (10, 8)
C. The Dense layer output units should be 10 instead of 1
D. The model output shape is (32, 1) but target shape is (32, 10, 1), causing mismatch
💡 Hint

Check if the model output shape matches the target shape.
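One way to make the snippet above train without error, assuming per-timestep targets are actually intended, is to keep the full sequence so the model's output shape matches y_train:

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(10, 8))
# return_sequences=True keeps one 32-dim vector per timestep
# (16 units per direction, concatenated), so the Dense head
# produces shape (batch, 10, 1), matching the targets.
x = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(16, return_sequences=True))(inputs)
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')

x_train = tf.random.normal((32, 10, 8))
y_train = tf.random.normal((32, 10, 1))
model.fit(x_train, y_train, epochs=1, verbose=0)
```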