Text generation helps computers write new sentences or stories by learning from examples. It creates content that can be useful for many tasks.
Text Generation in NLP
Introduction
When you want a computer to write a story or poem automatically.
When you need to generate replies in a chat or customer-support system.
When creating summaries or explanations from long documents.
When translating ideas into readable text for reports or emails.
When making creative content like jokes, scripts, or advertisements.
Syntax
model.generate(input_text, max_length=desired_length)
model.generate is a common method for producing new text from a trained language model.
input_text is the prompt the model continues from; max_length caps the total length of the output (in libraries like Hugging Face Transformers, this count includes the prompt tokens).
Examples
This starts a story with 'Once upon a time' and generates a continuation of up to 50 tokens (prompt included).
output = model.generate("Once upon a time", max_length=50)
This generates a short reply to a greeting.
response = model.generate("Hello, how are you?", max_length=20)
Sample Model
This code uses a GPT-2 model to continue the sentence 'Today is a beautiful day' and prints the full generated text.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Load a small GPT-2 model and its tokenizer
model_name = 'gpt2'
tokenizer = GPT2Tokenizer.from_pretrained(model_name)
model = GPT2LMHeadModel.from_pretrained(model_name)

# Input text to start generation
input_text = "Today is a beautiful day"
inputs = tokenizer(input_text, return_tensors='pt')

# Generate a continuation of up to 30 tokens (prompt included)
outputs = model.generate(
    **inputs,
    max_length=30,
    num_return_sequences=1,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; this avoids a warning
)

# Decode and print the generated text
generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated_text)
Important Notes
Text generation models learn from many examples to predict what words come next.
Generated content may sometimes be surprising or creative because the model tries to guess likely continuations.
Always check generated text for accuracy and appropriateness before using it.
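The "predict what words come next" idea above can be sketched with a toy example. This is a minimal illustration using word-pair counts, not how GPT-2 works internally (GPT-2 uses a neural network trained on far more data), and the names train_bigrams and generate are made up for this sketch. It shows the same learn-then-continue loop: learn which words follow which, then extend a prompt one word at a time.

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Record, for each word, the words observed to follow it."""
    words = text.split()
    followers = defaultdict(list)
    for a, b in zip(words, words[1:]):
        followers[a].append(b)
    return followers

def generate(followers, start, max_length=10, seed=0):
    """Continue from `start` by sampling a previously seen next word each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(max_length - 1):
        candidates = followers.get(out[-1])
        if not candidates:  # no known continuation: stop early
            break
        out.append(rng.choice(candidates))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ran"
model = train_bigrams(corpus)
print(generate(model, "the", max_length=5))
```

Because the next word is sampled from what was seen in training, rerunning with a different seed can produce a different continuation, which mirrors why large models sometimes give surprising or creative output.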
Summary
Text generation creates new sentences by predicting words after a given start.
It is useful for writing stories, replies, summaries, and more.
Models like GPT-2 use this technique to produce human-like text.