Recall & Review
beginner
What is transfer learning in machine learning?
Transfer learning is a method where a model developed for one task is reused as the starting point for a model on a second task. It is especially helpful when you have a small dataset, because the model can reuse knowledge learned from a larger dataset.
beginner
Why is transfer learning useful for small datasets?
Because small datasets often don't have enough data to train a model well from scratch, transfer learning uses a pre-trained model's knowledge to improve learning and reduce training time.
intermediate
In TensorFlow, what is a common step after loading a pre-trained model for transfer learning?
A common step is to freeze the pre-trained layers so their weights don't change, then add new layers on top to train on the small dataset.
beginner
What does 'freezing layers' mean in transfer learning?
Freezing layers means making the weights of some layers unchangeable during training, so the model keeps the learned features from the original task.
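Freezing can also be applied per layer rather than to a whole model. A short sketch (the two-layer model is illustrative, not from the original):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(4),
    tf.keras.layers.Dense(1),
])

# Freeze only the first layer; its weights are excluded from
# gradient updates, so it keeps its learned features.
model.layers[0].trainable = False
```

After this, only the second layer's kernel and bias appear in `model.trainable_weights`.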
intermediate
How can you adapt a pre-trained model to a new classification task with fewer classes?
You replace the final layer(s) of the pre-trained model with new layers whose output size matches the number of classes in your new task, then train only the new layers, or optionally fine-tune some of the pre-trained layers as well.
Click to reveal answer
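A sketch of swapping the head, assuming a hypothetical pre-trained model with 1000 output classes and a new task with 5:

```python
import tensorflow as tf

# Hypothetical "pre-trained" model that ends in a 1000-class head.
pretrained = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1000, activation="softmax"),
])

# Keep everything except the old classification head.
feature_extractor = tf.keras.Model(
    inputs=pretrained.inputs,
    outputs=pretrained.layers[-2].output,
)
feature_extractor.trainable = False

# New head sized for the new task's 5 classes.
model = tf.keras.Sequential([
    feature_extractor,
    tf.keras.layers.Dense(5, activation="softmax"),
])
```

With `include_top=False` in `tf.keras.applications` models, the old head is never loaded in the first place, which achieves the same result more directly.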
What is the main advantage of using transfer learning on small datasets?
Transfer learning leverages knowledge from a model trained on a large dataset to help train on a smaller dataset.
In TensorFlow, what does freezing layers do during transfer learning?
Freezing layers means their weights stay fixed and do not update during training.
Which step is usually done first when applying transfer learning with TensorFlow?
You start by loading a pre-trained model to reuse its learned features.
How do you handle a different number of classes in your new task compared to the pre-trained model?
The final layer must be replaced to output predictions for the new classes.
What is a common practice to avoid overfitting when training on small datasets with transfer learning?
Freezing most layers helps keep learned features and reduces overfitting on small data.
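A sketch of freezing most layers while leaving a few unfrozen for fine-tuning (the three-layer base is illustrative):

```python
import tensorflow as tf

base = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
])

# Freeze all but the last base layer to limit the number of trainable
# parameters and reduce the risk of overfitting on a small dataset.
for layer in base.layers[:-1]:
    layer.trainable = False
```

Fewer trainable parameters means the small dataset has fewer ways to memorize noise, which is why this practice helps against overfitting.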
Explain how transfer learning helps when you have a small dataset. Include the role of freezing layers and adapting the final layer.
Think about reusing knowledge and adjusting the model for your task.
Describe the steps to apply transfer learning in TensorFlow for a new image classification task with limited data.
Focus on loading, freezing, replacing, compiling, and training.
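The five steps — loading, freezing, replacing, compiling, training — can be sketched end to end. This is a minimal illustration with a tiny stand-in base and random data; a real workflow would load an actual pre-trained model and use your own dataset:

```python
import numpy as np
import tensorflow as tf

# 1. "Load" a pre-trained model (stand-in here; in practice, e.g.
#    tf.keras.applications.MobileNetV2(weights="imagenet", include_top=False)).
base = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
])

# 2. Freeze the pre-trained layers.
base.trainable = False

# 3. Replace the head with one sized for the new task (assume 3 classes).
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(3, activation="softmax"),
])

# 4. Compile with an optimizer, loss, and metrics.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# 5. Train on the (small) dataset — random data here for illustration.
x = np.random.rand(20, 8).astype("float32")
y = np.random.randint(0, 3, size=(20,))
model.fit(x, y, epochs=1, verbose=0)
```

Only the new head trains in this configuration; a common follow-up is to unfreeze a few top base layers and continue training at a lower learning rate.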