TensorFlow · ML · ~10 mins

Text generation with an RNN in TensorFlow - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
1. Fill in the blank (easy)

Complete the code to create a simple RNN layer in TensorFlow.

TensorFlow
rnn_layer = tf.keras.layers.SimpleRNN([1], return_sequences=True)
A. relu
B. 0.1
C. True
D. 128
Common Mistakes
Using a float value instead of an integer for units.
Using activation function name instead of units.
Setting return_sequences instead of units.
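To see what the missing `units` argument controls, here is a pure-Python sketch of a single SimpleRNN step (the tiny dimensions and weight values are illustrative only, not TensorFlow's internals):

```python
import math

def simple_rnn_step(x, h_prev, W_x, W_h, b):
    """One SimpleRNN step: h_t = tanh(x @ W_x + h_prev @ W_h + b).

    `units` in tf.keras.layers.SimpleRNN(128) is simply len(h_t):
    the size of this hidden-state vector (so it must be an integer).
    """
    units = len(b)
    h_new = []
    for j in range(units):
        s = b[j]
        s += sum(x[i] * W_x[i][j] for i in range(len(x)))
        s += sum(h_prev[k] * W_h[k][j] for k in range(units))
        h_new.append(math.tanh(s))
    return h_new

# Toy example: input size 2, units=3 (the quiz layer would use units=128).
x = [1.0, -0.5]
h0 = [0.0, 0.0, 0.0]
W_x = [[0.1, 0.2, 0.3], [0.0, -0.1, 0.1]]
W_h = [[0.0] * 3 for _ in range(3)]
b = [0.0, 0.0, 0.0]
h1 = simple_rnn_step(x, h0, W_x, W_h, b)
print(len(h1))  # the hidden state has exactly `units` entries
```

With `return_sequences=True`, the layer returns this hidden state at every timestep rather than only the last one, which is what a character-level language model needs.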
2. Fill in the blank (medium)

Complete the code to compile the model with an appropriate loss function for text generation.

TensorFlow
model.compile(optimizer='adam', loss=[1])
A. 'categorical_crossentropy'
B. 'mse'
C. 'binary_crossentropy'
D. 'hinge'
Common Mistakes
Using mean squared error which is for regression.
Using binary crossentropy which is for two-class problems.
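Why categorical crossentropy? Predicting the next character is a multi-class problem over the whole vocabulary, and the loss reduces to minus the log-probability the model assigns to the correct character. A minimal hand computation (toy 4-character vocabulary, made-up probabilities):

```python
import math

def categorical_crossentropy(y_true, y_pred):
    """Cross-entropy for one sample: -sum_i y_true[i] * log(y_pred[i]).

    With a one-hot y_true this reduces to -log of the probability the
    model assigned to the correct next character.
    """
    return -sum(t * math.log(p) for t, p in zip(y_true, y_pred) if t > 0)

# The true next character is index 2 of a 4-character vocabulary.
y_true = [0, 0, 1, 0]
good = [0.05, 0.05, 0.85, 0.05]   # confident, correct prediction
bad  = [0.40, 0.30, 0.10, 0.20]   # little mass on the right character

print(categorical_crossentropy(y_true, good))  # small loss
print(categorical_crossentropy(y_true, bad))   # larger loss
```

Mean squared error treats the output as a regression target and binary crossentropy assumes exactly two classes, so neither matches a vocabulary-sized softmax output.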
3. Fill in the blank (hard)

Fix the error in the code to correctly prepare input sequences for training.

TensorFlow
sequences = [text[i:i+[1]] for i in range(len(text) - seq_length)]
A. seq_length
B. len(text)
C. i
D. batch_size
Common Mistakes
Using the entire text length instead of the sequence length.
Using the loop variable i as the slice length.
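The corrected slicing can be checked in plain Python; every window must have the same length so the sequences can be batched into one tensor:

```python
text = "hello world"
seq_length = 4

# Each training example is a fixed-length window of the text. Using
# seq_length as the slice length (not len(text), and not the loop
# index i) gives uniform windows of exactly seq_length characters.
sequences = [text[i:i + seq_length] for i in range(len(text) - seq_length)]

print(sequences[:3])   # ['hell', 'ello', 'llo ']
print(len(sequences))  # len(text) - seq_length = 7 windows
```

If `i` were used as the slice length, the windows would grow with the loop index; if `len(text)` were used, every slice would run to the end of the string.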
4. Fill in the blank (hard)

Fill both blanks to define the model input shape and output layer activation for text generation.

TensorFlow
model = tf.keras.Sequential([
  tf.keras.layers.Embedding(input_dim=vocab_size, output_dim=64, input_length=[1]),
  tf.keras.layers.SimpleRNN(128),
  tf.keras.layers.Dense(vocab_size, activation=[2])
])
A. seq_length
B. 'softmax'
C. 'relu'
D. vocab_size
Common Mistakes
Using vocab_size as input_length.
Using relu activation in the output layer.
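The reason softmax (and not relu) belongs on the output layer can be shown without TensorFlow; this sketch mirrors what the activation does to the `Dense(vocab_size)` logits:

```python
import math

def softmax(logits):
    """Softmax over the vocabulary: maps raw logits to positive scores
    that sum to 1, so the Dense(vocab_size) output can be read as a
    probability distribution over the next character. relu would leave
    unnormalized (and possibly zero) scores, which cannot be sampled from
    as probabilities."""
    m = max(logits)                      # subtract max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Toy logits for a 5-character vocabulary.
probs = softmax([2.0, 1.0, 0.1, -1.0, 0.5])
print(sum(probs))  # 1.0 (up to floating-point error)
```

Similarly, the Embedding layer's `input_length` must be the length of the input sequences (`seq_length`), while `vocab_size` belongs to `input_dim` and to the final Dense layer's width.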
5. Fill in the blank (hard)

Fill all three blanks to generate text from the trained model step-by-step.

TensorFlow
input_eval = [char2idx[[1]]]
input_eval = tf.expand_dims(input_eval, 0)
predictions = model(input_eval)
predicted_id = tf.random.categorical(predictions[:, [2]], num_samples=1)[[3]][0,0].numpy()
A. 'start_char'
B. 0
C. 1
D. 'end_char'
Common Mistakes
Using 'end_char' instead of 'start_char' to begin generation.
Using wrong indices for predictions slicing.
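The shape of the generation loop can be sketched in pure Python. Here `random.choices` stands in for `tf.random.categorical`, and `fake_model` is a hypothetical stand-in for calling the trained model; the `char2idx`/`idx2char` tables are the usual lookups built from the corpus:

```python
import random

# Toy vocabulary and lookup tables (stand-ins for the real char2idx /
# idx2char built from the training text).
vocab = ['a', 'b', 'c']
char2idx = {c: i for i, c in enumerate(vocab)}
idx2char = {i: c for i, c in enumerate(vocab)}

def fake_model(seed_id):
    """Hypothetical stand-in for model(input_eval): returns a probability
    distribution over the next character given the current one."""
    table = {0: [0.1, 0.8, 0.1], 1: [0.1, 0.1, 0.8], 2: [0.8, 0.1, 0.1]}
    return table[seed_id]

random.seed(0)
current = char2idx['a']          # generation begins from a start character
generated = ['a']
for _ in range(5):
    probs = fake_model(current)
    # random.choices plays the role of tf.random.categorical here:
    # sample the next id from the predicted distribution instead of
    # always taking the argmax, which keeps the output varied.
    current = random.choices(range(len(vocab)), weights=probs, k=1)[0]
    generated.append(idx2char[current])

print(''.join(generated))
```

The loop seeds with a start character (not an end character), feeds each sampled id back in as the next input, and accumulates the decoded characters into the generated string.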