Complete the code to create a simple RNN layer in TensorFlow.
rnn_layer = tf.keras.layers.SimpleRNN(128, return_sequences=True)
The number 128 specifies the number of units in the RNN layer; with return_sequences=True, the layer outputs a 128-dimensional vector at every time step.
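A quick shape check makes this concrete. The sizes below are made up purely for illustration:

```python
import tensorflow as tf

# Hypothetical sizes, just for the shape check.
batch_size, seq_length, features = 4, 10, 8

rnn_layer = tf.keras.layers.SimpleRNN(128, return_sequences=True)
x = tf.random.normal((batch_size, seq_length, features))
y = rnn_layer(x)

# One 128-dimensional output per time step: (batch, time, units).
print(y.shape)  # (4, 10, 128)
```

Without return_sequences=True, only the final time step's 128-dimensional vector would be returned, giving shape (4, 128).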
Complete the code to compile the model with an appropriate loss function for text generation.
model.compile(optimizer='adam', loss='categorical_crossentropy')
For text generation over many classes (characters or words), categorical_crossentropy is the appropriate loss when targets are one-hot encoded; if targets are integer class ids, use sparse_categorical_crossentropy instead.
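A minimal, self-contained sketch of the compile step, assuming a hypothetical vocab_size and a small model like the one defined later in this exercise:

```python
import tensorflow as tf

vocab_size = 50  # hypothetical vocabulary size

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=vocab_size, output_dim=16),
    tf.keras.layers.SimpleRNN(32),
    tf.keras.layers.Dense(vocab_size, activation='softmax'),
])

# categorical_crossentropy expects one-hot targets; if your targets are
# integer class ids, sparse_categorical_crossentropy avoids the one-hot step.
model.compile(optimizer='adam', loss='categorical_crossentropy')
```

The Dense output size matches vocab_size so the loss compares a full probability distribution over the vocabulary against the target.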
Fix the error in the code to correctly prepare input sequences for training.
sequences = [text[i:i+seq_length] for i in range(len(text) - seq_length)]
The slice length should be seq_length to get fixed-size input sequences.
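A runnable toy example of this windowing, with a made-up text and seq_length, also collecting the next character after each window as its training target:

```python
# Hypothetical toy corpus and window size.
text = "hello world"
seq_length = 4

# Each window is exactly seq_length characters long.
sequences = [text[i:i + seq_length] for i in range(len(text) - seq_length)]
# The character immediately after each window is its prediction target.
next_chars = [text[i + seq_length] for i in range(len(text) - seq_length)]

print(sequences[0], '->', next_chars[0])  # hell -> o
```

Every window has the same fixed size, which is what lets the model train on uniformly shaped batches.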
Complete the code to define the model input shape and the output layer activation for text generation.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=vocab_size, output_dim=64, input_length=seq_length),
    tf.keras.layers.SimpleRNN(128),
    tf.keras.layers.Dense(vocab_size, activation='softmax')
])
The input_length should be seq_length to match the prepared input sequences. The output activation is softmax, which produces a probability distribution over the vocabulary.
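The model's output shape can be verified with a dummy batch. This sketch uses hypothetical sizes; note that newer Keras versions infer the sequence length from the data, so input_length is omitted here (older versions accept input_length=seq_length on the Embedding layer):

```python
import tensorflow as tf

vocab_size, seq_length = 50, 10  # hypothetical sizes

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=vocab_size, output_dim=64),
    tf.keras.layers.SimpleRNN(128),
    tf.keras.layers.Dense(vocab_size, activation='softmax'),
])

# One softmax distribution over the vocabulary per input sequence.
probs = model(tf.zeros((2, seq_length), dtype=tf.int32))
print(probs.shape)  # (2, 50)
```

Each row of the output sums to 1, confirming that softmax yields a valid probability distribution to sample the next token from.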
Complete the code to generate text from the trained model step by step.
input_eval = [char2idx[start_char]]
input_eval = tf.expand_dims(input_eval, 0)
predictions = model(input_eval)
predicted_id = tf.random.categorical(tf.math.log(predictions), num_samples=1)[0, 0].numpy()
We seed generation with start_char and add a batch dimension with tf.expand_dims. Because the model's softmax output is a probability distribution and tf.random.categorical expects logits, we pass the log of the predictions. The [0, 0] index extracts the single sampled id from the (batch, num_samples) result.
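Putting the pieces together, here is a self-contained sampling loop. The vocabulary, char2idx mapping, and untrained toy model below are stand-ins for the real artifacts produced by the training steps above:

```python
import tensorflow as tf

# Hypothetical toy vocabulary; in practice char2idx comes from the corpus.
vocab = sorted(set("hello world"))
char2idx = {c: i for i, c in enumerate(vocab)}
idx2char = {i: c for c, i in char2idx.items()}
vocab_size = len(vocab)

# Untrained stand-in for the trained model (same architecture as above).
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=vocab_size, output_dim=8),
    tf.keras.layers.SimpleRNN(16),
    tf.keras.layers.Dense(vocab_size, activation='softmax'),
])

start_char = 'h'
input_eval = tf.expand_dims([char2idx[start_char]], 0)  # shape (1, 1)

generated = [start_char]
for _ in range(5):
    predictions = model(input_eval)  # shape (1, vocab_size), softmax probabilities
    # tf.random.categorical expects logits, so take the log of the probabilities.
    predicted_id = tf.random.categorical(tf.math.log(predictions), num_samples=1)[0, 0].numpy()
    generated.append(idx2char[int(predicted_id)])
    # Feed the sampled character back in as the next input.
    input_eval = tf.expand_dims([predicted_id], 0)

print(''.join(generated))
```

With an untrained model the output is random characters; after training, the sampled distribution reflects the learned character statistics. Feeding each sampled id back as the next input is what makes the generation autoregressive.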