What is Transfer Learning in AI: Simple Explanation and Example
Transfer learning is an AI technique that reuses pretrained models to save time and improve performance, especially when data is limited for the new task.
How It Works
Imagine you learned to ride a bicycle. Later, learning to ride a motorcycle becomes easier because you already know balance and steering. Transfer learning works similarly in AI. A model trained on a large dataset for one task learns useful features, like recognizing edges or shapes in images.
Instead of starting from scratch, transfer learning takes this pretrained model and fine-tunes it for a new task with less data. This saves time and often leads to better results because the model already understands basic patterns.
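The freeze-and-retrain idea can be sketched in a few lines of NumPy. Here a fixed random projection stands in for the pretrained feature extractor (an illustrative toy, not a real pretrained network), and only a small linear head is fitted on the new task's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "pretrained" feature extractor: a fixed projection whose
# weights we pretend were learned on a large source task.
W_pretrained = rng.normal(size=(64, 16))

def extract_features(X):
    # Frozen: W_pretrained is never updated for the new task.
    return np.maximum(X @ W_pretrained, 0.0)  # ReLU features

# Small labeled dataset for the new task (2 classes, 40 samples)
X_new = rng.normal(size=(40, 64))
y_new = (X_new[:, 0] > 0).astype(float)

# Train only a tiny linear head on top of the frozen features
F = extract_features(X_new)                       # shape (40, 16)
head, *_ = np.linalg.lstsq(F, y_new, rcond=None)  # least-squares fit

preds = (F @ head > 0.5).astype(float)
accuracy = (preds == y_new).mean()
```

Only the 16 head weights are learned here; the "pretrained" part never changes, which is exactly why so little data is needed.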
Example
This example shows how to use a pretrained image classification model and adapt it to classify a small set of new images.
import tensorflow as tf
import numpy as np
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model

# Load pretrained MobileNetV2 without its top classification layer
base_model = MobileNetV2(weights='imagenet', include_top=False, input_shape=(96, 96, 3))
base_model.trainable = False  # Freeze the base model

# Add a new classification head
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(128, activation='relu')(x)
predictions = Dense(5, activation='softmax')(x)  # 5 classes
model = Model(inputs=base_model.input, outputs=predictions)

# Compile the model
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])

# Dummy data for demonstration
X_train = np.random.rand(20, 96, 96, 3).astype('float32')
y_train = np.random.randint(0, 5, 20)

# Train only the new layers
history = model.fit(X_train, y_train, epochs=3, batch_size=5)

# Show training accuracy
print(f"Final training accuracy: {history.history['accuracy'][-1]:.2f}")
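A common second stage, not shown above, is fine-tuning: once the new head has trained, unfreeze the base model and continue training with a much lower learning rate so the pretrained weights are only gently adjusted. A minimal sketch of the mechanics, using a tiny stand-in base model so it runs quickly without downloading ImageNet weights:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

# Tiny stand-in for a pretrained base (in practice: MobileNetV2, etc.)
inputs = tf.keras.Input(shape=(96, 96, 3))
x = layers.Conv2D(8, 3, activation='relu')(inputs)
x = layers.GlobalAveragePooling2D()(x)
base_model = Model(inputs, x)

# Stage 1: freeze the base, train only the new head
base_model.trainable = False
head = layers.Dense(5, activation='softmax')(base_model.output)
model = Model(base_model.input, head)
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss='sparse_categorical_crossentropy')

X = np.random.rand(8, 96, 96, 3).astype('float32')
y = np.random.randint(0, 5, 8)
model.fit(X, y, epochs=1, verbose=0)

# Stage 2: unfreeze the base and fine-tune with a ~100x smaller
# learning rate, then recompile so the change takes effect
base_model.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss='sparse_categorical_crossentropy')
model.fit(X, y, epochs=1, verbose=0)
```

Recompiling after toggling `trainable` matters: Keras fixes the set of trainable weights at compile time, so flipping the flag alone does nothing until the model is compiled again.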
When to Use
Use transfer learning when you have limited data for a new task but want good results quickly. It is common in image recognition, natural language processing, and speech tasks.
For example, a company can use a model pretrained on millions of images to classify its specific products with only a few hundred photos, saving both time and computing power.
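One cheap workflow this enables: run every photo through the frozen base once, cache the resulting feature vectors, and train a lightweight classifier on those vectors, so the expensive network processes each image only one time. A sketch with stand-in arrays (the embedding size of 1280 matches MobileNetV2's pooled output, but the data here is random, not a real product dataset):

```python
import numpy as np

rng = np.random.default_rng(1)

# Pretend these are 1280-dim embeddings from a frozen pretrained base,
# computed once for a few hundred product photos and cached to disk.
n_photos, n_features, n_products = 300, 1280, 5
embeddings = rng.normal(size=(n_photos, n_features))
labels = rng.integers(0, n_products, n_photos)

# Nearest-centroid classifier: one mean embedding per product class.
centroids = np.stack([embeddings[labels == c].mean(axis=0)
                      for c in range(n_products)])

def classify(embedding):
    # Assign the product whose class centroid is closest in feature space.
    dists = np.linalg.norm(centroids - embedding, axis=1)
    return int(np.argmin(dists))

pred = classify(embeddings[0])
```

Because the classifier only sees small feature vectors, it trains in seconds on a laptop, with no GPU needed after the one-time embedding pass.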
Key Points
- Transfer learning reuses knowledge from one task to help another.
- It reduces training time and data needs.
- Pretrained models provide a strong starting point.
- Common in image, text, and speech AI applications.