
Loss functions (MSE, cross-entropy) in TensorFlow

Introduction

Loss functions measure how wrong a model's predictions are. During training, the optimizer uses the loss value to decide how to adjust the model's weights, so the loss function is what tells the model what to fix.

Common situations where you need a loss function:

When training a model to predict continuous numbers, like house prices.
When training a model to classify inputs into categories, like cats or dogs.
When measuring how well your model fits the data during training.
When comparing different models to pick the best one.
When tuning model settings to improve accuracy.
Syntax
TensorFlow
loss = tf.keras.losses.MeanSquaredError()
# or
loss = tf.keras.losses.CategoricalCrossentropy()

# To use in model compile:
model.compile(optimizer='adam', loss=loss, metrics=['accuracy'])

MSE (Mean Squared Error) is used for regression problems where output is continuous.
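As a sanity check, MSE is just the mean of the squared differences between targets and predictions; a minimal plain-Python version (no TensorFlow needed):

```python
def mean_squared_error(y_true, y_pred):
    """Mean of squared differences between targets and predictions."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# ((3.0 - 2.5)^2 + (5.0 - 5.5)^2) / 2 = (0.25 + 0.25) / 2 = 0.25
print(mean_squared_error([3.0, 5.0], [2.5, 5.5]))  # 0.25
```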

Cross-entropy is used for classification problems where output is categories.
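For one-hot labels, categorical cross-entropy reduces to the negative log of the probability the model assigned to the true class; a plain-Python sketch of the formula:

```python
import math

def categorical_crossentropy(y_true, y_pred):
    """-sum(t * log(p)) over classes; for one-hot t this is -log(p_true_class)."""
    return -sum(t * math.log(p) for t, p in zip(y_true, y_pred))

# True class is index 1 with predicted probability 0.9: -log(0.9) ≈ 0.1054
print(categorical_crossentropy([0, 1, 0], [0.05, 0.9, 0.05]))
```

Note how the loss only depends on the probability given to the correct class: confident, correct predictions give a loss near 0, while confident, wrong predictions give a large loss.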

Examples
This calculates the MSE between two sets of numbers.
TensorFlow
import tensorflow as tf

mse_loss = tf.keras.losses.MeanSquaredError()
result = mse_loss([3.0, 5.0], [2.5, 5.5]).numpy()  # 0.25
This calculates cross-entropy loss for one-hot encoded class labels.
TensorFlow
import tensorflow as tf

cross_entropy = tf.keras.losses.CategoricalCrossentropy()
result = cross_entropy(
  [[0, 1, 0]], [[0.05, 0.9, 0.05]]).numpy()  # ≈ 0.1054
Sample Model

This program calculates and prints the MSE loss for regression data and cross-entropy loss for classification data using TensorFlow.

TensorFlow
import tensorflow as tf

# Sample data for regression
true_values = [[3.0], [5.0], [7.0]]
predictions = [[2.5], [5.5], [6.0]]

mse_loss = tf.keras.losses.MeanSquaredError()
mse_value = mse_loss(true_values, predictions).numpy()

# Sample data for classification
true_labels = [[0, 1, 0], [1, 0, 0]]  # one-hot
pred_probs = [[0.05, 0.9, 0.05], [0.8, 0.1, 0.1]]
cross_entropy = tf.keras.losses.CategoricalCrossentropy()
ce_value = cross_entropy(true_labels, pred_probs).numpy()

print(f"MSE Loss: {mse_value:.4f}")
print(f"Cross-Entropy Loss: {ce_value:.4f}")
Output
MSE Loss: 0.5000
Cross-Entropy Loss: 0.1643
Important Notes

MSE is the standard choice for regression (predicting numbers); cross-entropy is the standard choice for classification (predicting categories).

By default, CategoricalCrossentropy expects probabilities as predictions, so use a softmax activation in the last layer; alternatively, create the loss with from_logits=True and feed it raw logits, which is more numerically stable.
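The from_logits option effectively folds the softmax into the loss. A plain-Python sketch of that equivalence (illustrative only, not TensorFlow's actual fused implementation):

```python
import math

def softmax(logits):
    """Convert raw scores into probabilities that sum to 1."""
    exps = [math.exp(x - max(logits)) for x in logits]  # shift by max for stability
    total = sum(exps)
    return [e / total for e in exps]

logits = [0.1, 2.2, 0.3]   # raw model outputs, no softmax layer applied
probs = softmax(logits)

# Cross-entropy with one-hot label [0, 1, 0]: -log(probability of true class).
# A loss built with from_logits=True computes this same value directly from logits.
loss = -math.log(probs[1])
print(loss)
```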

Loss values get smaller as the model improves during training.
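A toy gradient-descent loop makes this concrete: fitting a single weight w to data generated by y = 2x, the MSE loss shrinks step by step (a plain-Python sketch, no TensorFlow required):

```python
# Fit y = w * x by minimizing MSE with gradient descent.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]   # generated with w = 2

w = 0.0        # initial guess
lr = 0.05      # learning rate
losses = []

for step in range(50):
    preds = [w * x for x in xs]
    loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(xs)
    losses.append(loss)
    # Gradient of MSE with respect to w: mean of 2 * (pred - y) * x
    grad = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / len(xs)
    w -= lr * grad

print(losses[0], losses[-1])  # the loss shrinks as w approaches 2
```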

Summary

Loss functions measure how bad the model's predictions are.

MSE is for number predictions; cross-entropy is for categories.

Use loss functions to guide the model to learn better.