Challenge - 5 Problems
GRU Mastery Badge
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
Intermediate · 2:00
Output shape of a GRU layer
What is the output shape of the following GRU layer when input shape is (batch_size=32, timesteps=10, features=8)?
TensorFlow
import tensorflow as tf
import numpy as np

layer = tf.keras.layers.GRU(16)
input_shape = (32, 10, 8)
x = np.random.random(input_shape).astype('float32')
out = layer(x)
print(out.shape)
💡 Hint
By default, GRU returns the last output for each sample, not the full sequence.
✓ Explanation
The GRU layer with default settings returns the last output for each sample, so the output shape is (batch_size, units) which is (32, 16).
❓ Model Choice
Intermediate · 1:30
Choosing GRU for sequence data
You want to build a model to predict the next word in a sentence using a recurrent neural network. Which layer is best suited for capturing long-term dependencies efficiently?
💡 Hint
GRU is designed to handle long-term dependencies better than SimpleRNN.
✓ Explanation
GRU layers are designed to capture long-term dependencies in sequences more efficiently than SimpleRNN layers, making them suitable for language modeling.
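A minimal sketch of such a language model, assuming illustrative sizes (a 5,000-word vocabulary and 20-token sequences, not values from the problem): token ids are embedded, a GRU summarizes the sequence, and a softmax layer scores every candidate next word.

```python
import tensorflow as tf

# Illustrative sizes, not taken from the problem statement.
vocab_size, seq_len = 5000, 20

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 64),                # token ids -> dense vectors
    tf.keras.layers.GRU(128),                                 # gated units track long-term context
    tf.keras.layers.Dense(vocab_size, activation='softmax'),  # probability over next words
])

# A dummy batch of 4 token-id sequences.
x = tf.random.uniform((4, seq_len), maxval=vocab_size, dtype=tf.int32)
probs = model(x)
print(probs.shape)  # (4, 5000)
```

The GRU's update and reset gates let gradients flow across many timesteps, which is what makes it a better fit than SimpleRNN here.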
❓ Hyperparameter
Advanced · 1:30
Effect of return_sequences in GRU
What is the effect of setting return_sequences=True in a GRU layer?
💡 Hint
Think about whether the output includes all timesteps or just the last one.
✓ Explanation
Setting return_sequences=True makes the GRU return the output at every timestep, resulting in a 3D tensor of shape (batch_size, timesteps, units).
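The two output shapes can be checked side by side, reusing the (32, 10, 8) input shape from the first problem:

```python
import tensorflow as tf

x = tf.random.uniform((32, 10, 8))  # (batch_size, timesteps, features)

last = tf.keras.layers.GRU(16)(x)                         # default: last timestep only
full = tf.keras.layers.GRU(16, return_sequences=True)(x)  # output at every timestep

print(last.shape)  # (32, 16)
print(full.shape)  # (32, 10, 16)
```

return_sequences=True is needed whenever the next layer consumes a sequence, e.g. a stacked GRU or a per-timestep Dense head.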
🔧 Debug
Advanced · 2:00
Identifying error in GRU input shape
What error will this code raise and why?
import tensorflow as tf
layer = tf.keras.layers.GRU(10)
x = tf.random.uniform((5, 10))
out = layer(x)
💡 Hint
Check the input tensor dimensions expected by GRU layers.
✓ Explanation
GRU layers expect 3D input tensors (batch_size, timesteps, features). The input here is 2D, causing a ValueError.
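One way to repair the code, assuming each of the 10 values should act as a timestep with a single feature, is to add the missing axis explicitly:

```python
import tensorflow as tf

layer = tf.keras.layers.GRU(10)
x = tf.random.uniform((5, 10))    # 2D: no timestep/feature separation

# layer(x) would raise ValueError here (expected ndim=3, found ndim=2).
# Adding a trailing feature axis makes the input (batch, timesteps, features):
x3d = tf.expand_dims(x, axis=-1)  # shape (5, 10, 1)
out = layer(x3d)
print(out.shape)  # (5, 10)
```

Which axis to add depends on the data: if the 10 values were instead features of a single timestep, the reshape would be to (5, 1, 10).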
❓ Metrics
Expert · 2:30
Interpreting GRU training loss and accuracy
A GRU model for binary classification is trained for 10 epochs. The training loss steadily decreases, but training accuracy remains around 50%. What is the most likely explanation?
💡 Hint
If loss decreases but accuracy stays at chance level, what could be wrong?
✓ Explanation
If loss decreases while accuracy stays near 50% (chance level for balanced binary classes), the output activation, loss function, or accuracy threshold is most likely mismatched, e.g. the model emits raw logits but the loss or metric is interpreting them as probabilities.
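One consistent setup, as a sketch: keep the Dense head as raw logits and tell both the loss and the accuracy metric so. The input shape (10, 8) is illustrative. If instead logits are fed to a loss or metric that expects probabilities, you get exactly this symptom: the loss moves while accuracy sits at chance.

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(10, 8))   # (timesteps, features), illustrative
h = tf.keras.layers.GRU(16)(inputs)
logits = tf.keras.layers.Dense(1)(h)     # no sigmoid: raw logits
model = tf.keras.Model(inputs, logits)

model.compile(
    optimizer='adam',
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
    # threshold=0.0 is the decision boundary in logit space (sigmoid(0) = 0.5).
    metrics=[tf.keras.metrics.BinaryAccuracy(threshold=0.0)],
)

y = model(tf.random.uniform((4, 10, 8)))
print(y.shape)  # (4, 1)
```

The alternative is equally valid: a `sigmoid` on the final Dense layer with the default `from_logits=False` and `threshold=0.5`. The bug is mixing the two conventions.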