Challenge - 5 Problems
Loss Function Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
Intermediate · 2:00
Output of MSE loss calculation
What is the output of the Mean Squared Error (MSE) loss calculation for the given predictions and true values?
TensorFlow
import tensorflow as tf

y_true = tf.constant([1.0, 2.0, 3.0])
y_pred = tf.constant([1.5, 2.5, 2.0])
mse = tf.keras.losses.MeanSquaredError()
loss_value = mse(y_true, y_pred).numpy()
print(loss_value)
💡 Hint
Recall MSE is the average of squared differences between true and predicted values.
📖 Explanation
MSE is calculated as the average of (y_true - y_pred)^2. Here, differences are [-0.5, -0.5, 1.0], squared are [0.25, 0.25, 1.0], average is (0.25+0.25+1.0)/3 = 0.5.
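The arithmetic in the explanation can be checked by hand without TensorFlow; a minimal plain-Python sketch of the same computation:

```python
# Manual check of the MSE formula: mean of squared differences.
y_true = [1.0, 2.0, 3.0]
y_pred = [1.5, 2.5, 2.0]

squared_diffs = [(t - p) ** 2 for t, p in zip(y_true, y_pred)]  # [0.25, 0.25, 1.0]
mse = sum(squared_diffs) / len(squared_diffs)
print(mse)  # 0.5, matching the tf.keras.losses.MeanSquaredError result
```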
🧠 Conceptual
Intermediate · 1:30
Choosing loss function for binary classification
Which loss function is most appropriate for a binary classification problem with outputs as probabilities?
💡 Hint
Binary classification outputs probabilities between 0 and 1 for two classes.
📖 Explanation
Binary Cross-Entropy is designed for two-class problems with probability outputs. Categorical Cross-Entropy targets multi-class problems. MSE treats the task as regression and yields weak gradients when predictions saturate near 0 or 1, so it is a poor fit for probability outputs.
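The binary cross-entropy formula itself is short: BCE = -mean(y·log(p) + (1-y)·log(1-p)). A hand-rolled sketch with illustrative values (not taken from the challenge):

```python
import math

# Binary cross-entropy computed directly from its definition.
def binary_cross_entropy(y_true, y_pred):
    return -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
                for y, p in zip(y_true, y_pred)) / len(y_true)

# Illustrative labels and predicted probabilities (hypothetical example).
loss = binary_cross_entropy([1, 0, 1], [0.9, 0.1, 0.8])
print(loss)  # small loss, since every prediction is close to its label
```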
❓ Metrics
Advanced · 1:30
Interpreting cross-entropy loss value
Given a binary classification model outputting probabilities, what does a cross-entropy loss value close to 0 indicate?
💡 Hint
Cross-entropy loss measures the difference between predicted probabilities and true labels.
📖 Explanation
A loss close to 0 means predicted probabilities match true labels well, indicating good model performance.
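For a single binary example the loss reduces to -log of the probability assigned to the true class, which makes the "close to 0" behavior easy to see. A sketch with illustrative probabilities:

```python
import math

# Per-example binary cross-entropy: -(y*log(p) + (1-y)*log(1-p)).
def bce_single(y, p):
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

good = bce_single(1, 0.99)  # confident and correct -> loss near 0
bad = bce_single(1, 0.01)   # confident but wrong  -> large loss
print(good, bad)
```

A near-zero loss therefore means the model assigns high probability to the correct label; a confidently wrong prediction is penalized heavily.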
🔧 Debug
Advanced · 2:00
Error in cross-entropy loss with logits
What happens when you use tf.keras.losses.BinaryCrossentropy(from_logits=False) on raw logits instead of probabilities?
TensorFlow
import tensorflow as tf

loss_fn = tf.keras.losses.BinaryCrossentropy(from_logits=False)
logits = tf.constant([0.0, 2.0, -1.0])
labels = tf.constant([0, 1, 0], dtype=tf.float32)
loss = loss_fn(labels, logits).numpy()
print(loss)
💡 Hint
Check the from_logits parameter and input values expected.
📖 Explanation
Setting from_logits=False tells the loss to expect probabilities in [0, 1]. Keras does not raise an exception here: it clips the inputs into that range and silently computes a misleading loss value. Pass from_logits=True so the sigmoid is applied to the raw logits internally.
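Conceptually, from_logits=True means "apply a sigmoid first, then compute binary cross-entropy". A plain-Python sketch of that math, using the logits and labels from the snippet above (this illustrates the formula, not TensorFlow's numerically-stabilized internals):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

logits = [0.0, 2.0, -1.0]
labels = [0.0, 1.0, 0.0]

# Map raw logits into (0, 1) before applying the cross-entropy formula.
probs = [sigmoid(z) for z in logits]
loss = -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
            for y, p in zip(labels, probs)) / len(labels)
print(loss)
```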
❓ Model Choice
Expert · 2:30
Selecting loss function for multi-class classification with logits
You have a multi-class classification problem with 5 classes. Your model outputs raw logits (not probabilities). Which loss function and parameter setting is correct to use in TensorFlow?
💡 Hint
Multi-class with logits requires categorical cross-entropy with from_logits=True.
📖 Explanation
CategoricalCrossentropy with from_logits=True applies softmax internally to the logits, which is correct when labels are one-hot vectors. from_logits=False expects probabilities. SparseCategoricalCrossentropy(from_logits=True) is the equivalent choice when labels are integer class indices rather than one-hot vectors. BinaryCrossentropy is for binary tasks.
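What from_logits=True does for one sample can be sketched directly: softmax over the logits, then -log of the probability at the true class. The 5-class logits and label below are illustrative, not part of the challenge:

```python
import math

# Hypothetical 5-class logits; the true class is index 2 (one-hot below).
logits = [1.0, 2.0, 3.0, 0.5, 0.1]
one_hot = [0, 0, 1, 0, 0]

# Softmax turns logits into a probability distribution.
exps = [math.exp(z) for z in logits]
softmax = [e / sum(exps) for e in exps]

# Categorical cross-entropy: -log of the probability at the true class.
loss = -sum(y * math.log(p) for y, p in zip(one_hot, softmax))
print(loss)
```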