NLP · ~10 mins

Why text generation creates content in NLP - Test Your Understanding

Practice - 5 Tasks
Answer the questions below
1. Fill in the blank (easy)

Complete the code to generate text using a simple model.

generated_text = model.[1](input_sequence)
A. predict
B. generate
C. fit
D. compile
Common Mistakes
Using 'fit' instead of 'predict', which trains the model rather than generating output.
Using 'compile', which configures the model for training but does not generate output.
2. Fill in the blank (medium)

Complete the code to prepare input text for the model.

input_sequence = tokenizer.[1](raw_text)
A. decode
B. fit
C. encode
D. transform
Common Mistakes
Using 'decode', which converts token ids back to text rather than encoding it.
Using 'fit', which trains the tokenizer rather than encoding text.
3. Fill in the blank (hard)

Fix the error in the code to generate text from the model.

output = model.predict([1])
A. raw_text
B. input_sequence
C. tokenizer
D. model
Common Mistakes
Passing raw text directly causes errors, because the model expects numeric token ids.
Passing the tokenizer or model object instead of the encoded input data.
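A small sketch of why the encoded sequence, not raw text, must be passed to predict. The predict function here is a hypothetical stand-in that just shifts each token id; real models differ, but all expect numeric input rather than strings.

```python
def predict(input_sequence):
    # Toy stand-in for a model: operates on numeric token ids only.
    return [t + 1 for t in input_sequence]

try:
    predict("hello world")  # iterating a string yields characters; "h" + 1 fails
except TypeError:
    print("raw text fails: model expects token ids")

print(predict([1, 2, 3]))  # encoded input works -> [2, 3, 4]
```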
4. Fill in the blank (hard)

Fill both blanks to convert model output tokens back to readable text.

decoded_text = tokenizer.[1](output_tokens, [2]=True)
A. decode
B. encode
C. skip_special_tokens
D. add_special_tokens
Common Mistakes
Using 'encode' instead of 'decode', which reverses the process.
Not skipping special tokens, which leaves extra symbols (e.g. padding markers) in the output.
5. Fill in the blank (hard)

Fill all three blanks to generate text and decode it properly.

input_seq = tokenizer.[1](text)
output_tokens = model.[2](input_seq)
result = tokenizer.[3](output_tokens, skip_special_tokens=True)
A. encode
B. predict
C. decode
D. fit
Common Mistakes
Using 'fit' instead of 'encode' or 'predict'.
Decoding before prediction, which passes the wrong data through the pipeline and causes errors.
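The full pipeline from task 5 can be sketched end to end. ToyTokenizer and ToyModel below are hypothetical stand-ins (not a real tokenizer or model API) that only exist to show the encode → predict → decode order of operations.

```python
class ToyTokenizer:
    """Maps words to integer ids and back; id 0 is a special <pad> token."""
    def __init__(self, vocab):
        self.word_to_id = {w: i for i, w in enumerate(vocab, start=1)}
        self.id_to_word = {i: w for w, i in self.word_to_id.items()}
        self.id_to_word[0] = "<pad>"

    def encode(self, text):
        # Raw text -> sequence of integer token ids.
        return [self.word_to_id[w] for w in text.split()]

    def decode(self, token_ids, skip_special_tokens=False):
        # Token ids -> text, optionally dropping the <pad> token.
        if skip_special_tokens:
            token_ids = [t for t in token_ids if t != 0]
        return " ".join(self.id_to_word[t] for t in token_ids)


class ToyModel:
    """Pretend 'generation': echo the input ids and pad to length 5."""
    def predict(self, input_sequence):
        return input_sequence + [0] * (5 - len(input_sequence))


tokenizer = ToyTokenizer(["hello", "world"])
model = ToyModel()

input_seq = tokenizer.encode("hello world")        # blank [1]: encode
output_tokens = model.predict(input_seq)           # blank [2]: predict
result = tokenizer.decode(output_tokens, skip_special_tokens=True)  # blank [3]: decode
print(result)  # -> hello world
```

Note the order: decoding only makes sense on the model's output tokens, which is why decoding before prediction fails.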