Challenge - 5 Problems
Multi-class Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual
intermediate
Understanding the output shape of a multi-class classification model
You build a neural network for a multi-class classification problem with 5 classes. The last layer uses a softmax activation. What should be the shape of the output layer?
💡 Hint
Think about how the model outputs probabilities for each class.
📝 Explanation
For multi-class classification, the output layer should have as many neurons as classes. Softmax activation converts outputs into probabilities that sum to 1.
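As a quick sketch of why the shape matters, here is a hand-rolled softmax (illustrative NumPy code, not any specific framework's API) applied to 5 logits, producing 5 probabilities that sum to 1:

```python
import numpy as np

def softmax(logits):
    """Convert raw scores (logits) into probabilities that sum to 1."""
    exps = np.exp(logits - np.max(logits))  # subtract max for numerical stability
    return exps / exps.sum()

# A 5-class problem: the output layer emits one logit per class.
logits = np.array([1.2, -0.3, 0.8, 2.1, 0.0])  # shape (5,)
probs = softmax(logits)

print(probs.shape)  # (5,) — one probability per class
print(probs.sum())  # 1.0
```

Each of the 5 output neurons corresponds to one class, and the predicted class is simply the index of the largest probability.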
❓ Metrics
intermediate
Choosing the right metric for multi-class classification
You trained a multi-class classifier with 4 classes. Which metric is best to evaluate overall model performance when classes are imbalanced?
💡 Hint
Consider a metric that treats all classes equally regardless of their frequency.
📝 Explanation
Macro-averaged F1 score calculates F1 for each class and averages them, giving equal importance to all classes, which is good for imbalanced data.
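A minimal NumPy sketch of macro-averaged F1 (the function name `macro_f1` and the toy labels are illustrative; in practice you would use a library such as scikit-learn with `average='macro'`):

```python
import numpy as np

def macro_f1(y_true, y_pred, n_classes):
    """Compute F1 per class, then average with equal weight per class."""
    f1s = []
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
        f1s.append(f1)
    return float(np.mean(f1s))

# Imbalanced toy labels for 4 classes: class 3 is rare and always missed.
y_true = np.array([0, 0, 0, 1, 1, 2, 2, 3])
y_pred = np.array([0, 0, 1, 1, 1, 2, 2, 0])
print(macro_f1(y_true, y_pred, n_classes=4))
```

Because the rare class contributes a full quarter of the average, missing it drags the macro F1 down noticeably, whereas plain accuracy would barely register the mistake.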
❓ Predict Output
advanced
Output of softmax probabilities for multi-class prediction
What is the output of the following Python code snippet?
ML Python
import numpy as np
from scipy.special import softmax

logits = np.array([2.0, 1.0, 0.1])
probabilities = softmax(logits)
print(probabilities)
💡 Hint
Softmax converts logits to probabilities that sum to 1.
📝 Explanation
The softmax function exponentiates each logit and normalizes by the sum of all exponentials. The output sums to 1 and reflects relative confidence.
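The same computation can be done directly with NumPy (equivalent to `scipy.special.softmax` for a 1-D array), which makes the expected output concrete:

```python
import numpy as np

# exponentiate each logit, then normalize by the sum of exponentials
logits = np.array([2.0, 1.0, 0.1])
exps = np.exp(logits)
probabilities = exps / exps.sum()

print(probabilities)        # approximately [0.659 0.242 0.099]
print(probabilities.sum())  # 1.0
```

The largest logit (2.0) gets the largest probability, but the others still receive nonzero mass, reflecting relative rather than absolute confidence.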
❓ Hyperparameter
advanced
Choosing the right loss function for multi-class classification
You are training a neural network for a 3-class classification problem. Which loss function should you use to train the model effectively?
💡 Hint
Consider a loss function designed for multi-class problems with one-hot labels.
📝 Explanation
Categorical Cross-Entropy is the standard loss for multi-class classification with one-hot encoded labels and softmax output.
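A hand-rolled sketch of the loss (the function name and toy predictions are illustrative; frameworks provide this as a built-in, e.g. a categorical cross-entropy loss paired with a softmax output):

```python
import numpy as np

def categorical_cross_entropy(y_true_onehot, y_pred_probs, eps=1e-12):
    """Mean negative log-probability assigned to the true class."""
    y_pred_probs = np.clip(y_pred_probs, eps, 1.0)  # avoid log(0)
    return float(-np.mean(np.sum(y_true_onehot * np.log(y_pred_probs), axis=1)))

# Two samples, 3 classes, one-hot encoded targets.
y_true = np.array([[1, 0, 0],
                   [0, 0, 1]])
y_pred = np.array([[0.7, 0.2, 0.1],   # confident and correct
                   [0.1, 0.3, 0.6]])  # less confident but still correct
print(categorical_cross_entropy(y_true, y_pred))
```

The one-hot label selects the predicted probability of the true class, so the loss is simply the average of `-log(p_true)` across samples: confident correct predictions cost little, while low probability on the true class is penalized heavily.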
🔧 Debug
expert
Identifying the cause of poor multi-class classification accuracy
You trained a multi-class classifier with 6 classes. The training accuracy is high, but test accuracy is very low. Which of the following is the most likely cause?
💡 Hint
High training accuracy but low test accuracy usually means the model memorizes training data.
📝 Explanation
Overfitting happens when the model learns the training data too well but fails to generalize to unseen data; a lack of regularization (e.g. dropout or weight decay) is a common cause.