A multi-class classification model helps us teach a computer to tell which category something belongs to when there are more than two choices.
Multi-class classification model in TensorFlow
Introduction
Common real-world examples of multi-class classification include:
Sorting emails into categories such as work, personal, or spam.
Recognizing handwritten digits from 0 to 9.
Classifying types of fruit in pictures, such as apples, bananas, or oranges.
Detecting the type of animal in a photo among cats, dogs, and birds.
Syntax
TensorFlow
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(units, activation='relu', input_shape=(input_features,)),
    tf.keras.layers.Dense(number_of_classes, activation='softmax')
])

model.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy']
)
The last layer uses softmax to produce a probability for each class.
Use the sparse_categorical_crossentropy loss when labels are integers representing class IDs.
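To make the label-format distinction concrete, the short sketch below (plain NumPy, no TensorFlow needed) contrasts the integer class IDs that sparse_categorical_crossentropy expects with the equivalent one-hot rows that the plain categorical_crossentropy loss expects:

```python
import numpy as np

# Integer class IDs: the format sparse_categorical_crossentropy expects
y_int = np.array([0, 2, 1])

# Equivalent one-hot encoding: the format categorical_crossentropy expects
num_classes = 3
y_onehot = np.eye(num_classes)[y_int]

print(y_onehot)
# Each row has a single 1 in the column of its class ID
```

If your labels are already one-hot encoded, switch the loss to categorical_crossentropy; otherwise keep the integer labels and the sparse loss.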
Examples
This example builds a model for 3 classes with 4 input features.
TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(3, activation='softmax')
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
This example is for 5 classes with 10 input features.
TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation='relu', input_shape=(10,)),
    tf.keras.layers.Dense(5, activation='softmax')
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
Sample Model
This program trains a simple multi-class model on small sample data with 3 classes. It then predicts classes for new samples.
TensorFlow
import tensorflow as tf
import numpy as np

# Sample data: 6 samples, 4 features each
X_train = np.array([
    [5.1, 3.5, 1.4, 0.2],
    [7.0, 3.2, 4.7, 1.4],
    [6.3, 3.3, 6.0, 2.5],
    [5.0, 3.6, 1.4, 0.2],
    [6.7, 3.1, 4.4, 1.4],
    [7.6, 3.0, 6.6, 2.1]
])

# Labels: 3 classes (0, 1, 2)
y_train = np.array([0, 1, 2, 0, 1, 2])

# Build model
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(3, activation='softmax')
])

# Compile model
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])

# Train model
history = model.fit(X_train, y_train, epochs=10, verbose=0)

# Predict on new data
X_new = np.array([
    [5.9, 3.0, 5.1, 1.8],
    [5.0, 3.4, 1.5, 0.2]
])
predictions = model.predict(X_new)
predicted_classes = predictions.argmax(axis=1)

print(f"Training accuracy after 10 epochs: {history.history['accuracy'][-1]:.2f}")
print(f"Predicted classes for new samples: {predicted_classes.tolist()}")
Important Notes
Make sure your labels are integers starting from 0 when using the sparse categorical loss.
Softmax outputs probabilities that sum to 1 for each sample.
More epochs usually improve training accuracy, but watch out for overfitting.
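To see the softmax note in action, this small NumPy sketch (illustrative only, not part of the model above) computes softmax by hand and checks that each sample's probabilities sum to 1:

```python
import numpy as np

def softmax(logits):
    # Subtract the row max for numerical stability, then normalize
    shifted = logits - logits.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=1, keepdims=True)

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 0.5, 0.5]])
probs = softmax(logits)

print(probs.sum(axis=1))  # each row sums to 1
```

Note that equal logits (the second row) give equal probabilities, which is why argmax over the softmax output picks the class with the largest logit.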
Summary
Multi-class models classify inputs into more than two categories.
Use softmax activation in the last layer to get class probabilities.
Use sparse_categorical_crossentropy loss when labels are integer class IDs.