Overview - Freezing layers
What is it?
Freezing layers means preventing some parts of a neural network from updating during training. When layers are frozen, their weights are excluded from gradient updates, so the model keeps what it has already learned in those parts. This is useful when you want to keep some knowledge fixed and train only other parts. It saves training time and helps avoid overwriting useful information.
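In PyTorch, one common way to freeze a layer is to set `requires_grad = False` on its parameters, so gradients are not computed for them and the optimizer never updates them. Here is a minimal sketch with a small made-up model (the layer sizes and learning rate are arbitrary choices for illustration):

```python
import torch
import torch.nn as nn

# A tiny example model: one "feature" layer followed by a small head.
model = nn.Sequential(
    nn.Linear(8, 16),   # layer 0: we will freeze this one
    nn.ReLU(),
    nn.Linear(16, 4),   # layer 2: stays trainable
)

# Freeze the first linear layer: its parameters stop receiving gradients.
for param in model[0].parameters():
    param.requires_grad = False

# Hand only the still-trainable parameters to the optimizer.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=0.01)

# One training step on random data: the frozen weights do not change.
before = model[0].weight.clone()
x, y = torch.randn(5, 8), torch.randn(5, 4)
loss = nn.functional.mse_loss(model(x), y)
loss.backward()
optimizer.step()
assert torch.equal(model[0].weight, before)  # frozen layer kept its values
```

After the step, layer 0 is exactly as it was before training, while layer 2 has been updated normally.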
Why it matters
Freezing layers lets us reuse knowledge from a model trained on one task to help with a new task. Without freezing, training on the new task can overwrite what the model already learned, making it slower to train or less accurate. This matters in practice when data is limited or training is expensive: freezing most of a pretrained model leaves far fewer parameters to update, so you can build capable models faster and with less data.
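The typical transfer-learning pattern is to freeze a pretrained "backbone" and attach a small new head for the new task. The sketch below uses a hypothetical stand-in backbone (in practice this would be a real pretrained network) just to show how few parameters remain trainable:

```python
import torch.nn as nn

# Hypothetical stand-in for a pretrained backbone, plus a new task head.
backbone = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 64))
head = nn.Linear(64, 3)  # fresh layer for the new 3-class task

# Freeze every backbone parameter; only the head will learn.
for param in backbone.parameters():
    param.requires_grad = False

model = nn.Sequential(backbone, head)

def count_trainable(m):
    """Number of parameters that will actually be updated by training."""
    return sum(p.numel() for p in m.parameters() if p.requires_grad)

total = sum(p.numel() for p in model.parameters())
print(f"trainable: {count_trainable(model)} / {total} parameters")
# → trainable: 195 / 12611 parameters
```

Only the head's 195 parameters need gradients and optimizer updates, which is why this setup trains quickly even on small datasets.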
Where it fits
Before learning freezing layers, you should understand how neural networks train and update weights using gradients. After this, you can learn transfer learning, fine-tuning, and how to build efficient models by combining frozen and trainable parts.