TensorFlow · ~10 mins

Bidirectional RNN in TensorFlow - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1 - Fill in the blank (easy)

Complete the code to create a bidirectional RNN layer using TensorFlow.

TensorFlow
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.[1](
        tf.keras.layers.SimpleRNN(32), input_shape=(10, 8))
])
Options:
A) Bidirectional
B) Dense
C) Conv1D
D) Dropout
Common Mistakes
Using Dense or Conv1D layers instead of Bidirectional for RNNs.
Not wrapping the RNN layer inside the Bidirectional wrapper.
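For reference, here is the snippet with the blank filled in, as a runnable sketch (assuming TensorFlow 2.x; the batch size of 4 and the random input are illustrative, not part of the task):

```python
import tensorflow as tf

# Bidirectional wraps the RNN, running it forwards and backwards over
# the sequence and concatenating the two outputs.
model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(
        tf.keras.layers.SimpleRNN(32), input_shape=(10, 8))
])

# Batch of 4 sequences, each 10 time steps with 8 features per step.
x = tf.random.normal([4, 10, 8])
y = model(x)
print(y.shape)  # (4, 64): 32 forward units + 32 backward units
```

Note that the feature dimension doubles to 64, because the forward and backward passes each produce 32 features.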
Task 2 - Fill in the blank (medium)

Complete the code to get the output shape of a bidirectional RNN layer.

TensorFlow
import tensorflow as tf

layer = tf.keras.layers.Bidirectional(tf.keras.layers.SimpleRNN(16))
input_data = tf.random.normal([5, 10, 8])
output = layer(input_data)
print(output.shape)  # Expected shape: (5, [1])
Options:
A) 16
B) 32
C) 8
D) 10
Common Mistakes
Expecting output shape to be (5, 16) instead of (5, 32).
Confusing batch size with feature size.
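To see why the feature size is 32 rather than 16, run the layer and compare merge modes (a sketch assuming TensorFlow 2.x; `merge_mode='concat'` is the default):

```python
import tensorflow as tf

# Default merge_mode='concat': the forward and backward outputs are
# concatenated, so 16 units per direction become 32 features.
layer = tf.keras.layers.Bidirectional(tf.keras.layers.SimpleRNN(16))
output = layer(tf.random.normal([5, 10, 8]))
print(output.shape)  # (5, 32)

# merge_mode='sum' adds the two directions instead, keeping width 16.
summed_layer = tf.keras.layers.Bidirectional(
    tf.keras.layers.SimpleRNN(16), merge_mode='sum')
summed = summed_layer(tf.random.normal([5, 10, 8]))
print(summed.shape)  # (5, 16)
```

The first dimension (5) is the batch size and is unaffected by the wrapper.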
Task 3 - Fill in the blank (hard)

Complete the code to correctly create a bidirectional LSTM layer.

TensorFlow
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(tf.keras.layers.[1](64), input_shape=(20, 10))
])
Options:
A) LSTM
B) Dense
C) SimpleRNN
D) Conv2D
Common Mistakes
Using Dense or Conv2D layers inside Bidirectional.
Using SimpleRNN when LSTM is required.
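A runnable version with the blank filled in (assuming TensorFlow 2.x; the batch of 2 random sequences is illustrative only):

```python
import tensorflow as tf

# The layer inside Bidirectional must be a recurrent layer; here an
# LSTM with 64 units, run in both directions over the sequence.
model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(64), input_shape=(20, 10))
])

x = tf.random.normal([2, 20, 10])  # batch of 2, 20 steps, 10 features
y = model(x)
print(y.shape)  # (2, 128): 64 forward + 64 backward
```

A Dense or Conv2D layer would raise an error here, since Bidirectional expects a layer with a recurrent `go_backwards` behavior.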
Task 4 - Fill in the blank (hard)

Fill both blanks to create a bidirectional GRU layer that returns sequences.

TensorFlow
import tensorflow as tf

layer = tf.keras.layers.Bidirectional(
    tf.keras.layers.[1](128, return_sequences=[2])
)
Options:
A) GRU
B) True
C) False
D) SimpleRNN
Common Mistakes
Setting return_sequences to False when sequences are needed.
Using SimpleRNN instead of GRU.
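With both blanks filled in, the layer looks like this (a sketch assuming TensorFlow 2.x; the input of 2 sequences with 6 steps and 16 features is an arbitrary example):

```python
import tensorflow as tf

# return_sequences=True makes the GRU emit an output at every time
# step, so the bidirectional output keeps the time dimension.
layer = tf.keras.layers.Bidirectional(
    tf.keras.layers.GRU(128, return_sequences=True))

x = tf.random.normal([2, 6, 16])  # batch of 2, 6 steps, 16 features
y = layer(x)
print(y.shape)  # (2, 6, 256): 128 forward + 128 backward per step
```

With `return_sequences=False` the time dimension would collapse and the output shape would be (2, 256), which breaks any downstream layer that expects per-step outputs.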
Task 5 - Fill in the blank (hard)

Fill all three blanks to build a bidirectional RNN model with an LSTM layer, a dropout layer, and a dense output layer.

TensorFlow
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(tf.keras.layers.[1](64), input_shape=(15, 12)),
    tf.keras.layers.[2](0.5),
    tf.keras.layers.[3](10, activation='softmax')
])
Options:
A) LSTM
B) Dropout
C) Dense
D) SimpleRNN
Common Mistakes
Using SimpleRNN instead of LSTM for the first layer.
Forgetting to add Dropout for regularization.
Using Dense without activation or wrong activation.
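The completed model, as a runnable sketch (assuming TensorFlow 2.x; the batch of 3 random sequences is illustrative):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    # Bidirectional LSTM returns one 128-wide vector per sequence
    # (64 forward + 64 backward units).
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(64), input_shape=(15, 12)),
    # Dropout randomly zeroes 50% of features during training.
    tf.keras.layers.Dropout(0.5),
    # Dense + softmax maps the features to 10 class probabilities.
    tf.keras.layers.Dense(10, activation='softmax')
])

x = tf.random.normal([3, 15, 12])
probs = model(x)
print(probs.shape)  # (3, 10); each row of probabilities sums to 1
```

Because the final activation is softmax, each output row is a probability distribution over the 10 classes.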