ML Python · ~20 mins

Multi-class classification in ML Python - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️ Multi-class Mastery: get all five challenges correct to earn this badge!
🧠 Conceptual · intermediate
Understanding the output shape of a multi-class classification model
You build a neural network for a multi-class classification problem with 5 classes. The last layer uses a softmax activation. What should be the shape of the output layer?
A) Five neurons with softmax activation
B) A single neuron with sigmoid activation
C) Five neurons with sigmoid activation
D) A single neuron with softmax activation
💡 Hint: Think about how the model outputs probabilities for each class.
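As a minimal sketch in plain NumPy (no framework), a 5-class softmax head produces one neuron per class, i.e. a length-5 probability vector; the logit values below are made up for illustration:

```python
import numpy as np

def softmax(z):
    # Subtract the max before exponentiating for numerical stability
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Hypothetical logits from a 5-neuron output layer (one neuron per class)
logits = np.array([1.2, 0.4, -0.8, 2.1, 0.0])
probs = softmax(logits)

print(probs.shape)            # (5,): one probability per class
print(round(probs.sum(), 6))  # 1.0: softmax outputs form a distribution
```

A single sigmoid neuron (options B and D make no sense with softmax anyway) only works for binary problems; with 5 classes you need 5 output neurons so each class gets its own probability.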
Metrics · intermediate
Choosing the right metric for multi-class classification
You trained a multi-class classifier with 4 classes. Which metric is best to evaluate overall model performance when classes are imbalanced?
A) Precision for class 1 only
B) Macro-averaged F1 score
C) Accuracy
D) Mean Squared Error
💡 Hint: Consider a metric that treats all classes equally regardless of their frequency.
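A minimal from-scratch sketch of macro-averaged F1 (the toy labels are made up; in practice you would use `sklearn.metrics.f1_score(..., average='macro')`). Note how a rare class the model misses drags macro F1 down even while plain accuracy stays high:

```python
import numpy as np

def macro_f1(y_true, y_pred, n_classes):
    """Macro-averaged F1: compute F1 per class, then take the unweighted mean."""
    f1s = []
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        f1s.append(f1)
    return np.mean(f1s)

# Imbalanced toy labels: class 3 is rare and the model never predicts it
y_true = np.array([0, 0, 0, 0, 1, 1, 2, 2, 3, 0])
y_pred = np.array([0, 0, 0, 1, 1, 1, 2, 2, 0, 0])

print("accuracy:", np.mean(y_true == y_pred))          # 0.8
print("macro F1:", macro_f1(y_true, y_pred, n_classes=4))  # 0.65
```

Because each class contributes equally to the mean, the missed rare class costs a full quarter of the score, which is exactly the sensitivity to imbalance you want.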
Predict Output · advanced
Output of softmax probabilities for multi-class prediction
What is the output of the following Python code snippet?
ML Python
import numpy as np
from scipy.special import softmax

logits = np.array([2.0, 1.0, 0.1])
probabilities = softmax(logits)
print(probabilities)
A) [0.73105858 0.19661193 0.07232949]
B) [0.50000000 0.30000000 0.20000000]
C) [0.65900114 0.24243297 0.09856589]
D) [0.33333333 0.33333333 0.33333333]
💡 Hint: Softmax converts logits to probabilities that sum to 1.
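You can check option C by hand without `scipy`: exponentiate each logit and divide by the sum (subtracting the max first changes nothing mathematically but avoids overflow):

```python
import numpy as np

logits = np.array([2.0, 1.0, 0.1])

# Softmax by hand: exponentiate, then normalize so the values sum to 1
exps = np.exp(logits - logits.max())  # max-subtraction for stability; ratios unchanged
probs = exps / exps.sum()

print(np.round(probs, 8))     # [0.65900114 0.24243297 0.09856589]
print(round(probs.sum(), 6))  # 1.0
```

Option A is a distractor built from per-element sigmoids of the same logits, which do not sum to 1; softmax normalizes jointly across all classes.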
Hyperparameter · advanced
Choosing the right loss function for multi-class classification
You are training a neural network for a 3-class classification problem. Which loss function should you use to train the model effectively?
A) Categorical Cross-Entropy
B) Binary Cross-Entropy
C) Hinge Loss
D) Mean Squared Error
💡 Hint: Consider a loss function designed for multi-class problems with one-hot labels.
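As a sketch of what categorical cross-entropy actually computes (the targets and predicted probabilities below are made up for a 3-class example), the loss is the negative log-probability the model assigned to the true class, averaged over samples:

```python
import numpy as np

def categorical_cross_entropy(y_true_onehot, y_pred_probs, eps=1e-12):
    """Mean categorical cross-entropy: -sum(y * log(p)), averaged over samples."""
    p = np.clip(y_pred_probs, eps, 1.0)  # avoid log(0)
    return -np.mean(np.sum(y_true_onehot * np.log(p), axis=1))

# 3-class problem with one-hot targets
y_true = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [0, 0, 1]])
# Hypothetical softmax outputs; each row sums to 1
y_pred = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1],
                   [0.2, 0.2, 0.6]])

loss = categorical_cross_entropy(y_true, y_pred)
print(round(loss, 4))  # 0.3635 = -(ln 0.7 + ln 0.8 + ln 0.6) / 3
```

Because the one-hot target zeroes out every term except the true class, only the probability assigned to the correct label matters, which pairs naturally with a softmax output layer.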
🔧 Debug · expert
Identifying the cause of poor multi-class classification accuracy
You trained a multi-class classifier with 6 classes. The training accuracy is high, but test accuracy is very low. Which of the following is the most likely cause?
A) Incorrect use of sigmoid activation in the output layer
B) Model is underfitting due to too few epochs
C) Using categorical cross-entropy loss with one-hot labels
D) Model is overfitting due to lack of regularization
💡 Hint: High training accuracy but low test accuracy usually means the model is memorizing the training data.
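You can see this failure mode without any neural network. As a toy sketch, a degree-9 polynomial fit to 10 noisy points has enough capacity to memorize them exactly, so training error is near zero while error on held-out points is far larger (exact numbers depend on the random seed):

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a simple underlying function
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, x_train.size)
x_test = np.linspace(0.05, 0.95, 10)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.2, x_test.size)

# A degree-9 polynomial can pass through all 10 training points exactly
coeffs = np.polyfit(x_train, y_train, deg=9)
mse = lambda x, y: np.mean((np.polyval(coeffs, x) - y) ** 2)

print("train MSE:", mse(x_train, y_train))  # near zero: training data memorized
print("test MSE:", mse(x_test, y_test))     # much larger: poor generalization
```

The gap between the two numbers is the signature of overfitting; regularization (weight decay, dropout, early stopping) or more data shrinks it.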