TensorFlow · ML · ~12 min read

Freezing and unfreezing layers in TensorFlow - Model Pipeline Trace

Model Pipeline - Freezing and unfreezing layers

This pipeline shows how a model keeps learned features fixed by freezing layers while a new classifier is trained, then unfreezes some of those layers to fine-tune the whole model and improve accuracy.
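In Keras, freezing is controlled by a layer's `trainable` flag. A minimal sketch of the idea:

```python
import tensorflow as tf

# Freezing a layer excludes its weights from gradient updates.
layer = tf.keras.layers.Dense(10)
layer.build((None, 32))              # create the kernel and bias
layer.trainable = False              # freeze
print(len(layer.trainable_weights))  # 0: nothing to update while frozen
layer.trainable = True               # unfreeze for fine-tuning
print(len(layer.trainable_weights))  # 2: kernel and bias are trainable again
```

Flipping the flag back is all "unfreezing" means; the rest of the pipeline is about when to flip it and at what learning rate to continue training.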

Data Flow - 6 Stages
Stage 1 - Data input
Load images with labels.
Input: 1000 rows x 32 x 32 x 3 → Output: 1000 rows x 32 x 32 x 3
Example: an image of a cat with the label 'cat'

Stage 2 - Preprocessing
Normalize pixel values to the 0-1 range.
Input: 1000 rows x 32 x 32 x 3 → Output: 1000 rows x 32 x 32 x 3
Example: pixel value 120 becomes 0.47

Stage 3 - Feature extraction (frozen layers)
Pass images through the frozen convolutional layers.
Input: 1000 rows x 32 x 32 x 3 → Output: 1000 rows x 8 x 8 x 64
Example: a feature map highlighting edges

Stage 4 - Train classifier layers
Train the dense layers on the extracted features.
Input: 1000 rows x 8 x 8 x 64 → Output: 1000 rows x 10 (class probabilities)
Example: [0.1, 0.7, 0.05, ..., 0.02]

Stage 5 - Unfreeze layers
Unfreeze some convolutional layers for fine-tuning.
Input: 1000 rows x 8 x 8 x 64 → Output: 1000 rows x 8 x 8 x 64
Example: the layers are now trainable, so the learned features can adjust

Stage 6 - Fine-tune entire model
Train all layers with a lower learning rate.
Input: 1000 rows x 8 x 8 x 64 → Output: 1000 rows x 10 (class probabilities)
Example: [0.05, 0.8, 0.03, ..., 0.01]
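The six stages above can be sketched end to end in Keras. This uses synthetic stand-ins for the 1000 labeled images, and the layer sizes are illustrative (chosen only to reproduce the 8 x 8 x 64 shape from the trace):

```python
import tensorflow as tf

# Stages 1-2: load images and normalize pixel values to the 0-1 range.
images = tf.random.uniform((1000, 32, 32, 3), maxval=255.0)  # stand-in data
labels = tf.random.uniform((1000,), maxval=10, dtype=tf.int32)
images = images / 255.0  # e.g. pixel value 120 becomes ~0.47

# Stage 3: a convolutional base that we freeze, so it only extracts features.
base = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(),  # 16 x 16
    tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(),  # 8 x 8 x 64, matching the trace
])
base.trainable = False  # learned features stay fixed

# Stage 4: train only the dense classifier head on the extracted features.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # class probabilities
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(images, labels, epochs=2, verbose=0)

# Stage 5: unfreeze the base so its filters can adapt to the new data.
base.trainable = True

# Stage 6: recompile (required after changing `trainable`) with a lower
# learning rate, then fine-tune the entire model.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(images, labels, epochs=2, verbose=0)
```

Recompiling after flipping `trainable` is what makes the change take effect; the lower learning rate in stage 6 keeps fine-tuning from overwriting the useful pretrained features.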
Training Trace - Epoch by Epoch

Epoch 1: ************ (loss=1.20)
Epoch 2: *********    (loss=0.90)
Epoch 3: *******      (loss=0.70)
Epoch 4: ******       (loss=0.65)
Epoch 5: *****        (loss=0.50)
Epoch 6: ****         (loss=0.45)
(each * ≈ 0.1 loss)
Epoch  Loss ↓  Accuracy ↑  Observation
1      1.20    0.55        Training classifier layers with frozen base
2      0.90    0.68        Accuracy improves as the classifier learns
3      0.70    0.75        Stable improvement with frozen base
4      0.65    0.78        Some layers unfrozen for fine-tuning
5      0.50    0.85        Fine-tuning improves feature extraction
6      0.45    0.88        Model converges after unfreezing
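A per-epoch trace like the table above can be read straight off the `History` object that Keras returns from `fit`. A small sketch on toy data (shapes here are illustrative):

```python
import tensorflow as tf

# Keras records loss and accuracy for every epoch in History.history.
x = tf.random.uniform((256, 8))
y = tf.random.uniform((256,), maxval=10, dtype=tf.int32)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
history = model.fit(x, y, epochs=3, verbose=0)

# Print one line per epoch, like the trace above.
for epoch, (loss, acc) in enumerate(
        zip(history.history["loss"], history.history["accuracy"]), start=1):
    print(f"Epoch {epoch}: loss={loss:.2f} accuracy={acc:.2f}")
```

In a real freeze-then-unfreeze run you would call `fit` twice (once per phase) and concatenate the two histories to get a table spanning all six epochs.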
Prediction Trace - 5 Layers
Layer 1: Input layer
Layer 2: Frozen convolutional layers
Layer 3: Trainable dense layers
Layer 4: Unfrozen convolutional layers (fine-tuned)
Layer 5: Final dense layers
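Which of the layers above are frozen can be checked layer by layer via each layer's `trainable` flag. A sketch mirroring the trace (layer names are illustrative):

```python
import tensorflow as tf

# Conv block starts frozen; the dense head is trainable from the start.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),                               # Layer 1: input
    tf.keras.layers.Conv2D(64, 3, activation="relu",
                           trainable=False, name="frozen_conv"),     # Layer 2: frozen conv
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu", name="head"),      # Layer 3: trainable dense
    tf.keras.layers.Dense(10, activation="softmax", name="output"),  # Layer 5: final dense
])
for layer in model.layers:
    print(f"{layer.name}: trainable={layer.trainable}")

# Layer 4 in the trace: unfreeze the conv block for fine-tuning.
model.get_layer("frozen_conv").trainable = True
```

Printing the flags before and after unfreezing is a quick sanity check that only the intended layers will receive gradient updates.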
Model Quiz - 3 Questions
Test your understanding
Why do we freeze layers before training the classifier layers?
A. To keep learned features fixed and speed up training
B. To prevent the model from learning anything new
C. To make the model ignore the input data
D. To randomly change weights in the frozen layers
(Answer: A)
Key Insight
Freezing layers helps keep useful features stable while training new parts. Unfreezing allows the model to improve by adjusting earlier layers carefully, leading to better accuracy.