Overview - Label smoothing
What is it?
Label smoothing is a regularization technique used when training classification models to keep them from becoming overconfident in their predictions. Instead of training against hard targets that assign probability 1 to the correct class and 0 to every other class, it trains against softened targets: with a smoothing parameter epsilon, a share 1 - epsilon of the probability mass stays on the correct class and the remaining epsilon is spread across the classes. The model is then trained with the usual cross-entropy loss against these softened targets. This discourages extreme logit values, helps the model generalize better, and reduces overfitting.
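The target construction described above can be sketched in a few lines of NumPy. This follows the common formulation in which the uniform share epsilon/K is spread over all K classes (so the true class ends up with 1 - epsilon + epsilon/K); the function name and parameter values are illustrative, not from the original text.

```python
import numpy as np

def smooth_labels(labels, num_classes, epsilon=0.1):
    """Turn integer class labels into label-smoothed target distributions."""
    labels = np.asarray(labels)
    # Start every class at the uniform share epsilon / num_classes ...
    targets = np.full((len(labels), num_classes), epsilon / num_classes)
    # ... then place the remaining 1 - epsilon mass on the true class.
    targets[np.arange(len(labels)), labels] += 1.0 - epsilon
    return targets

targets = smooth_labels([2, 0], num_classes=4, epsilon=0.1)
# Each row is a valid distribution summing to 1: the true class holds
# 0.9 + 0.1/4 = 0.925 and each other class holds 0.1/4 = 0.025.
```

These smoothed rows replace the one-hot targets in the cross-entropy loss; many frameworks also expose this directly (for example, a `label_smoothing` argument on the loss function), so hand-building the targets is rarely needed in practice.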
Why it matters
Without label smoothing, models can push their predicted probabilities toward 0 and 1, becoming overconfident and poorly calibrated. Such models adapt badly to inputs that differ from the training distribution and can be badly wrong while reporting near-certainty. Label smoothing keeps predictions more cautious, which tends to improve calibration and performance on real-world tasks where data is noisy or shifts away from the training data.
Where it fits
Before learning label smoothing, you should understand basic classification tasks, how models output probabilities, and loss functions like cross-entropy. After mastering label smoothing, you can explore advanced regularization techniques and calibration methods to improve model reliability.