Recall & Review
beginner
What is a transformer model in simple terms?
A transformer is a type of AI model that processes data by paying attention to all parts of the input at once, much as you read a whole sentence to get its meaning.
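The "paying attention to all parts at once" idea can be sketched in a few lines of NumPy. This is a toy illustration of self-attention, not any real model: the sizes, weights, and names here are made up for the example.

```python
import numpy as np

def softmax(x):
    # Stable softmax over the last axis; each row sums to 1.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Every position produces a query, key, and value vector.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Scores say how much each position should look at every other position.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = softmax(scores)
    # The output for each position is a weighted blend of ALL positions.
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                       # 4 "words", 8-dim embeddings (toy sizes)
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)                                  # each position now mixes information from all four
```

The key point is in the last line of `self_attention`: no position is processed in isolation; each output row blends every input row, weighted by how relevant the model judges it to be.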
beginner
Why do different transformer models exist for different tasks?
Different tasks call for different skills, so transformers are adapted or trained differently to be good at tasks like translating languages, answering questions, or recognizing images.
intermediate
How does the size of a transformer affect its task?
Bigger transformers can learn more detail and handle harder tasks, but they need more computing power. Smaller ones are faster and cheaper to run, but may not perform as well.
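A quick way to see why size matters is to count parameters. The function below uses a common back-of-envelope estimate of roughly 12·d² weights per transformer block (4·d² for attention, 8·d² for a feed-forward layer with a 4× hidden size), ignoring biases, embeddings, and layer norms; real architectures vary.

```python
def block_params(d_model: int) -> int:
    # Rough estimate for one transformer block (biases, embeddings,
    # and layer norms ignored; assumes a 4x feed-forward hidden size).
    attention = 4 * d_model * d_model            # Wq, Wk, Wv, output projection
    feed_forward = 2 * d_model * (4 * d_model)   # up- and down-projection
    return attention + feed_forward

for d in (256, 1024, 4096):
    print(f"d_model={d}: ~{block_params(d):,} params per block")
```

Doubling the model width quadruples the parameters per block, which is why bigger models capture more but cost far more compute to train and run.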
beginner
What role does training data play in making transformers good at different tasks?
Transformers learn from examples. If they see many examples of a task, like translating, they get better at it. Different data helps them focus on different skills.
intermediate
What is fine-tuning in transformers?
Fine-tuning is like teaching a transformer a new skill after it has already learned general things. It helps the model do a specific job better, like answering questions about medicine.
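The fine-tuning idea can be sketched without any deep-learning library: keep a "pretrained" part frozen and train only a small task-specific head on new labelled examples. Everything below is illustrative; the random weights stand in for a real pretrained model, and the dataset is made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Frozen "pretrained" feature extractor (random stand-in for real weights).
W_pretrained = rng.normal(size=(10, 16)) / np.sqrt(10)

def features(x):
    # General-purpose representation learned earlier; NOT updated below.
    return np.tanh(x @ W_pretrained)

# Tiny task-specific dataset (e.g. a yes/no question), synthetic here.
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Fine-tuning: train only this small head on the new task.
w_head = np.zeros(16)
lr = 0.5
for _ in range(300):
    p = 1 / (1 + np.exp(-(features(X) @ w_head)))        # sigmoid prediction
    w_head -= lr * features(X).T @ (p - y) / len(y)      # logistic-regression gradient step

preds = 1 / (1 + np.exp(-(features(X) @ w_head))) > 0.5
print("training accuracy:", (preds == y).mean())
```

Only `w_head` changes during training; the general representation is reused as-is. That is the essence of fine-tuning: far fewer parameters to update, far less data needed, and the general knowledge is preserved.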
Why do we use different transformer models for different tasks?
Different tasks require transformers to learn different patterns and skills, so models are adapted or trained differently.
What does fine-tuning a transformer mean?
Fine-tuning means taking a model already trained on general data and training it more on task-specific data.
How does the size of a transformer model affect its performance?
Larger models can capture more details but require more computing power.
What is the main reason transformers pay attention to all parts of input data?
Attention lets transformers see how different parts of the input relate to each other, which improves their grasp of the whole.
What happens if a transformer is trained on data from many tasks?
Training on many tasks helps build general knowledge, but fine-tuning improves specific task performance.
Explain why different transformer models are designed or trained for different tasks.
Think about how learning a new skill requires focused practice.
Describe how model size and training data affect a transformer's ability to perform tasks.
Consider how a bigger brain and more practice help a person do better.