
Pre-training and fine-tuning concept in Prompt Engineering / GenAI - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is pre-training in machine learning?
Pre-training is the process where a model learns general patterns from a large dataset before being trained on a specific task. It's like learning the basics first before focusing on details.
beginner
What does fine-tuning mean in AI model training?
Fine-tuning means adjusting a pre-trained model on a smaller, specific dataset to make it perform well on a particular task, like customizing a general skill to a special job.
intermediate
Why do we use pre-training before fine-tuning?
Pre-training helps the model learn general knowledge, which saves time and data when fine-tuning. It’s like learning the alphabet before writing a story.
beginner
Give a real-life example of pre-training and fine-tuning.
Imagine learning to drive (pre-training) before learning to drive a race car (fine-tuning). You first learn general driving skills, then adjust to the special car’s needs.
intermediate
What is a benefit of fine-tuning a pre-trained model?
Fine-tuning allows the model to perform well on a new task with less data and time, making it faster and cheaper to get good results.
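The pre-train-then-fine-tune idea from the cards above can be sketched with a toy model. This is a minimal, made-up example (a one-weight linear model trained by gradient descent, with invented numbers), not a real ML library workflow: a large "general" dataset first teaches the weight a broad pattern, then a handful of task examples nudge it toward the specific task.

```python
def train(w, data, lr, epochs):
    """Gradient descent on mean squared error for the model y = w * x."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

# Pre-training: many general examples following the pattern y = 2x.
general_data = [(x, 2.0 * x) for x in range(1, 21)]
w = train(0.0, general_data, lr=0.001, epochs=500)   # w ends up near 2.0

# Fine-tuning: only three task examples following y = 2.5x.
# Starting from the pre-trained w means only a small adjustment is needed.
task_data = [(x, 2.5 * x) for x in range(1, 4)]
w = train(w, task_data, lr=0.01, epochs=500)

print(round(w, 2))  # prints 2.5
```

Training from scratch on just three task examples would work here too, but in real models the task dataset is far too small to learn everything; starting from pre-trained weights is what makes the small dataset sufficient.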
What is the main goal of pre-training a model?
A. To learn general patterns from a large dataset
B. To train on a small specific task dataset
C. To test the model's accuracy
D. To delete unnecessary data
Answer: A
Fine-tuning is best described as:
A. Training a model from scratch
B. Adjusting a pre-trained model on a specific task
C. Collecting more data
D. Evaluating model performance
Answer: B
Which of these is a benefit of pre-training?
A. Makes the model slower
B. Requires more data for each new task
C. Saves time when training on new tasks
D. Removes the need for fine-tuning
Answer: C
What is an example of fine-tuning?
A. Adjusting a general language model to understand medical terms
B. Learning to read before writing
C. Training a model on a large dataset
D. Deleting old training data
Answer: A
Why is fine-tuning important after pre-training?
A. It removes errors from pre-training
B. It helps the model forget old knowledge
C. It increases the model size
D. It customizes the model for a specific task
Answer: D
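A common fine-tuning strategy, related to "customizing the model for a specific task" above, is to freeze the pre-trained part of a model and train only a small task-specific head. The sketch below uses made-up numbers and a two-weight toy model (no real framework assumed): the frozen base weight stands in for pre-trained knowledge, and only the head weight is updated on the task data.

```python
base_w = 2.0   # pretend this was learned during pre-training (kept frozen)
head_w = 1.0   # task-specific head, starting from a rough initial value

# Small task dataset: targets follow y = 3 * base_w * x,
# so the ideal head weight for this task is 3.0.
task_data = [(x, 3.0 * base_w * x) for x in range(1, 6)]

lr = 0.001
for _ in range(500):
    # Gradient of mean squared error with respect to head_w only;
    # base_w receives no update, mimicking a frozen pre-trained layer.
    grad = sum(2 * (head_w * base_w * x - y) * base_w * x
               for x, y in task_data) / len(task_data)
    head_w -= lr * grad

print(round(head_w, 2))  # prints 3.0 -- the head adapted, the base is unchanged
```

Freezing the base keeps the general knowledge intact and makes fine-tuning cheap, since only a small fraction of the weights is trained.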
Explain in your own words what pre-training and fine-tuning mean and how they work together.
Think about learning general skills first, then specializing.
Describe a simple example from daily life that helps you understand why pre-training and fine-tuning are useful.
Consider learning a basic skill before a specialized one.