TensorFlow · ~5 mins

Bidirectional RNN in TensorFlow - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is a Bidirectional RNN?
A Bidirectional RNN is a type of recurrent neural network that processes data in both forward and backward directions. This helps the model understand context from past and future data points.
beginner
Why use Bidirectional RNNs instead of regular RNNs?
Bidirectional RNNs capture information from both past and future inputs, improving understanding of context, which is useful in tasks like speech recognition and language modeling.
intermediate
How does TensorFlow implement a Bidirectional RNN?
TensorFlow uses the `tf.keras.layers.Bidirectional` wrapper around RNN layers like LSTM or GRU to create a Bidirectional RNN that runs two RNNs in opposite directions and combines their outputs.
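As a minimal sketch of the wrapper in action (the GRU size and `merge_mode='sum'` choice here are illustrative assumptions, not from the card above), `tf.keras.layers.Bidirectional` accepts any RNN layer and a `merge_mode` argument that controls how the two directions are combined:

```python
import tensorflow as tf

# The wrapper accepts any RNN layer; here a GRU. merge_mode controls how the
# forward and backward outputs are combined: 'concat' (default), 'sum',
# 'mul', 'ave', or None (keep them separate).
bi_gru = tf.keras.layers.Bidirectional(
    tf.keras.layers.GRU(32, return_sequences=True),
    merge_mode='sum'
)
out = bi_gru(tf.zeros((1, 5, 8)))  # (batch, time steps, features)
print(out.shape)  # (1, 5, 32): summed, so the feature size stays 32
```

With `merge_mode='concat'` the last dimension would instead double to 64.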
intermediate
What are the outputs of a Bidirectional RNN layer?
The outputs are usually the concatenation of the forward and backward RNN outputs at each time step, giving richer information for the next layers.
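To see the concatenation concretely (a minimal sketch; the layer and input sizes are arbitrary), a 64-unit LSTM wrapped in `Bidirectional` with the default merge mode produces 64 forward + 64 backward features at every time step:

```python
import tensorflow as tf

# 64 units per direction; the default merge_mode='concat' joins the two
# directions along the feature axis.
layer = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(64, return_sequences=True)
)
x = tf.random.normal((2, 7, 10))  # (batch, time steps, features)
y = layer(x)
print(y.shape)  # (2, 7, 128): 64 forward + 64 backward per time step
```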
beginner
Write a simple TensorFlow code snippet to create a Bidirectional LSTM layer.
import tensorflow as tf

# Bidirectional LSTM with 64 units per direction; input is a variable-length
# sequence of 10-dimensional feature vectors.
model = tf.keras.Sequential([
  tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, return_sequences=True), input_shape=(None, 10))
])
What does a Bidirectional RNN process?
A. Random data order
B. Only forward data
C. Only backward data
D. Data in both forward and backward directions
Which TensorFlow layer wraps an RNN to make it bidirectional?
A. tf.keras.layers.Bidirectional
B. tf.keras.layers.Dense
C. tf.keras.layers.Conv1D
D. tf.keras.layers.Dropout
What is a common use case for Bidirectional RNNs?
A. Speech recognition
B. Image classification
C. Sorting numbers
D. Simple linear regression
What does the output of a Bidirectional RNN usually contain?
A. Only forward outputs
B. Concatenated forward and backward outputs
C. Only backward outputs
D. Random noise
In TensorFlow, which argument keeps the full sequence output from an LSTM layer inside a Bidirectional wrapper?
A. dropout=0.5
B. activation='relu'
C. return_sequences=True
D. units=32
Explain how a Bidirectional RNN works and why it is useful.
Think about reading a sentence forwards and backwards to understand it better.
Describe how to implement a Bidirectional RNN in TensorFlow with an example.
Remember the wrapper layer that runs two RNNs in opposite directions.
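One possible answer sketch for the prompt above (layer sizes, the 10-feature input, and the binary-classification head are illustrative assumptions): wrap an LSTM in `Bidirectional` inside a small end-to-end model.

```python
import tensorflow as tf

# A minimal sequence classifier built around a Bidirectional LSTM.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(None, 10)),        # variable-length 10-feature sequences
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(64)             # final states only; concat -> 128 features
    ),
    tf.keras.layers.Dense(1, activation='sigmoid'),  # binary prediction
])
model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()
```

Omitting `return_sequences=True` here is deliberate: for whole-sequence classification only the final combined state is needed, whereas stacked RNN layers would require the full per-step output.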