TensorFlow · ~10 mins

SimpleRNN layer in TensorFlow - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (Easy)

Complete the code to create a SimpleRNN layer with 50 units.

TensorFlow
from tensorflow.keras.layers import SimpleRNN
rnn_layer = SimpleRNN([1])
A) 50
B) 10
C) 100
D) 5
Common Mistakes
Using a number too small or too large without reason.
Forgetting to specify the number of units.
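For reference, the completed snippet for this task — the prompt asks for 50 units, so the blank takes option A:

```python
from tensorflow.keras.layers import SimpleRNN

# units is the first positional argument and must be a positive integer;
# the task specifies 50 hidden units.
rnn_layer = SimpleRNN(50)
```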
Task 2: Fill in the blank (Medium)

Complete the code to add a SimpleRNN layer to a Sequential model.

TensorFlow
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import SimpleRNN
model = Sequential()
model.add(SimpleRNN([1], input_shape=(10, 8)))
A) 8
B) 32
C) 16
D) 64
Common Mistakes
Not specifying input_shape in the first layer.
Choosing a number of units that is too small or too large.
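A sketch of the completed code. Any of the listed unit counts produces a valid layer; 32 is used here purely for illustration, not as the confirmed answer:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import SimpleRNN

model = Sequential()
# input_shape=(10, 8) means sequences of 10 timesteps, 8 features each;
# it must be given on the first layer so the model knows its input size.
# 32 units is one reasonable choice (illustrative, not the graded answer).
model.add(SimpleRNN(32, input_shape=(10, 8)))
```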
Task 3: Fill in the blank (Hard)

Fix the error in the SimpleRNN layer creation by completing the code.

TensorFlow
from tensorflow.keras.layers import SimpleRNN
rnn = SimpleRNN(units=[1], return_sequences=True)
A) 20
B) '20'
C) None
D) True
Common Mistakes
Passing units as a string instead of an integer.
Using invalid types like None or True for units.
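The fixed version of this snippet: `units` must be a plain integer, which rules out the string `'20'`, `None`, and `True` among the options:

```python
from tensorflow.keras.layers import SimpleRNN

# units=20 (an int) is the only valid choice; '20', None, and True
# would all raise an error or misconfigure the layer.
rnn = SimpleRNN(units=20, return_sequences=True)
```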
Task 4: Fill in the blank (Hard)

Fill both blanks to create a SimpleRNN layer that returns sequences and uses tanh activation.

TensorFlow
from tensorflow.keras.layers import SimpleRNN
rnn_layer = SimpleRNN([1], activation=[2], return_sequences=True)
A) 40
B) 'relu'
C) 'tanh'
D) 30
Common Mistakes
Using 'relu' instead of 'tanh' for activation.
Not setting return_sequences=True when needed.
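A sketch of the completed code. The task explicitly asks for tanh activation, so the second blank is `'tanh'`; either unit count would work, and 40 is shown only for illustration:

```python
from tensorflow.keras.layers import SimpleRNN

# activation='tanh' per the task; return_sequences=True makes the layer
# emit an output at every timestep rather than only the last one.
# 40 units is an illustrative choice between the two numeric options.
rnn_layer = SimpleRNN(40, activation='tanh', return_sequences=True)
```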
Task 5: Fill in the blank (Hard)

Fill all three blanks to build a Sequential model with a SimpleRNN layer followed by a Dense output layer.

TensorFlow
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import SimpleRNN, Dense
model = Sequential([
    SimpleRNN([1], input_shape=(15, 10), activation=[2]),
    Dense([3], activation='softmax')
])
A) 64
B) 'relu'
C) 5
D) 32
Common Mistakes
Mixing up the order of layers.
Using wrong activation functions.
Incorrect output size in Dense layer.
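One plausible completion of this model, assuming `'relu'` fills the activation blank (the only activation among the options) and 5 is the Dense output size to pair with softmax; 64 units for the SimpleRNN is an illustrative pick between the two remaining numbers:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import SimpleRNN, Dense

model = Sequential([
    # Recurrent layer first: 15 timesteps of 10 features each.
    # 64 units is illustrative; 32 would also build a valid model.
    SimpleRNN(64, input_shape=(15, 10), activation='relu'),
    # Dense output layer: 5 classes with softmax for classification.
    Dense(5, activation='softmax'),
])
```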