NLP - Text Generation

Question: Which of the following is the correct way to define a simple RNN layer in TensorFlow Keras for text generation?

A. tf.keras.layers.SimpleRNN(128, input_shape=(vocab_size, None))
B. tf.keras.layers.SimpleRNN(units=128, input_shape=(None, vocab_size))
C. tf.keras.layers.SimpleRNN(units=128, input_length=vocab_size)
D. tf.keras.layers.SimpleRNN(input_dim=vocab_size)
Step-by-Step Solution

Step 1: Recall the correct input shape format. For RNN layers, input_shape is (timesteps, features). Here timesteps is None (variable-length sequences) and features is vocab_size (the width of each input vector, e.g. one-hot encoded tokens).

Step 2: Check syntax correctness. tf.keras.layers.SimpleRNN(units=128, input_shape=(None, vocab_size)) correctly uses units=128 and input_shape=(None, vocab_size). The other options either swap the two dimensions or misuse input_dim / input_length, which are not substitutes for a full input_shape tuple.

Final Answer: Option B, tf.keras.layers.SimpleRNN(units=128, input_shape=(None, vocab_size))

Quick Check: correct RNN input shape = (timesteps, features).
Quick Trick: an RNN's input_shape is always (timesteps, features).

Common Mistakes:
- Swapping timesteps and features in input_shape
- Using input_dim instead of input_shape
- Confusing input_length with input_shape
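To see why the input is shaped (timesteps, features), the sketch below walks a tiny SimpleRNN-style recurrence by hand in plain Python (no TensorFlow): each timestep consumes one vector of width vocab_size and updates a hidden state of width units via h_t = tanh(x_t W_x + h_{t-1} W_h + b). All names and the toy weights here are illustrative, not a real Keras API.

```python
import math

def simple_rnn_forward(x, W_x, W_h, b):
    """x: sequence of timesteps, each a list of `features` floats.
    Returns the final hidden state (length = number of units)."""
    units = len(b)
    h = [0.0] * units                      # initial hidden state
    for x_t in x:                          # iterate over timesteps
        h_new = []
        for j in range(units):
            s = b[j]
            # input contribution: x_t (features) . W_x (features x units)
            s += sum(x_t[i] * W_x[i][j] for i in range(len(x_t)))
            # recurrent contribution: h (units) . W_h (units x units)
            s += sum(h[k] * W_h[k][j] for k in range(units))
            h_new.append(math.tanh(s))
        h = h_new
    return h

vocab_size, units = 4, 2
# zero weights: tanh(0) = 0 for every unit, whatever the sequence length
W_x = [[0.0] * units for _ in range(vocab_size)]
W_h = [[0.0] * units for _ in range(units)]
b = [0.0] * units
seq = [[1, 0, 0, 0], [0, 0, 1, 0], [0, 1, 0, 0]]  # 3 one-hot timesteps
print(simple_rnn_forward(seq, W_x, W_h, b))  # -> [0.0, 0.0]
```

Because the loop runs over however many timesteps the sequence has, the timesteps dimension can be left as None in Keras, while the features dimension (vocab_size) must be fixed so the weight matrix W_x has a definite shape.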