TensorFlow / ML · ~20 mins

Loss functions (MSE, cross-entropy) in TensorFlow - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output
intermediate
Time limit: 2:00
Output of MSE loss calculation
What is the output of the Mean Squared Error (MSE) loss calculation for the given predictions and true values?
TensorFlow
import tensorflow as tf

y_true = tf.constant([1.0, 2.0, 3.0])
y_pred = tf.constant([1.5, 2.5, 2.0])

mse = tf.keras.losses.MeanSquaredError()
loss_value = mse(y_true, y_pred).numpy()
print(loss_value)
A) 0.41666666
B) 0.33333334
C) 0.5
D) 0.25
💡 Hint
Recall MSE is the average of squared differences between true and predicted values.
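To make the hint concrete, here is a minimal pure-Python sketch of the MSE formula, using illustrative values different from the challenge so the answer stays hidden (this mirrors, but is not, TensorFlow's implementation):

```python
# MSE = mean of squared differences between true and predicted values.
# Illustrative values, not the ones from the challenge above.
y_true = [0.0, 1.0, 2.0]
y_pred = [0.5, 1.5, 2.0]

mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
print(mse)  # (0.25 + 0.25 + 0.0) / 3 ≈ 0.1667
```

Squaring makes every error positive and penalizes large errors more heavily than small ones.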
🧠 Conceptual
intermediate
Time limit: 1:30
Choosing loss function for binary classification
Which loss function is most appropriate for a binary classification problem with outputs as probabilities?
A) Binary Cross-Entropy
B) Categorical Cross-Entropy
C) Hinge Loss
D) Mean Squared Error (MSE)
💡 Hint
Binary classification outputs probabilities between 0 and 1 for two classes.
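As a refresher on what a probability-based loss for two classes computes, here is a pure-Python sketch of the averaged binary cross-entropy formula, -(y·log(p) + (1-y)·log(1-p)); it mirrors the math but not TensorFlow's exact, numerically clipped implementation:

```python
import math

# Binary cross-entropy averaged over samples (formula sketch only).
y_true = [1.0, 0.0, 1.0]
y_pred = [0.9, 0.2, 0.7]   # predicted probabilities in (0, 1)

bce = -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
           for y, p in zip(y_true, y_pred)) / len(y_true)
print(round(bce, 4))  # ≈ 0.2284
```

Note the formula only makes sense when predictions are probabilities between 0 and 1 for the two classes.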
Metrics
advanced
Time limit: 1:30
Interpreting cross-entropy loss value
Given a binary classification model outputting probabilities, what does a cross-entropy loss value close to 0 indicate?
A) The model predictions are very close to the true labels
B) The model predictions are random guesses
C) The model predictions are completely wrong
D) The model is overfitting
💡 Hint
Cross-entropy loss measures the difference between predicted probabilities and true labels.
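One way to build intuition for the hint: compare the per-sample loss for a confident correct prediction against a confident wrong one (a pure-Python sketch of the formula, not TensorFlow's implementation):

```python
import math

# Per-sample binary cross-entropy: -(y*log(p) + (1-y)*log(1-p)).
def bce(y, p):
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

confident_right = bce(1.0, 0.99)  # prediction close to the true label
confident_wrong = bce(1.0, 0.01)  # prediction far from the true label
print(confident_right)  # close to 0
print(confident_wrong)  # large
```

The closer the predicted probability is to the true label, the closer the loss is to 0; a random guess (p = 0.5) sits at about 0.693.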
🔧 Debug
advanced
Time limit: 2:00
Error in cross-entropy loss with logits
What error will occur when using tf.keras.losses.BinaryCrossentropy(from_logits=False) on raw logits instead of probabilities?
TensorFlow
import tensorflow as tf

loss_fn = tf.keras.losses.BinaryCrossentropy(from_logits=False)
logits = tf.constant([0.0, 2.0, -1.0])
labels = tf.constant([0, 1, 0], dtype=tf.float32)
loss = loss_fn(labels, logits).numpy()
print(loss)
A) TypeError because labels and logits have different types
B) ValueError due to logits not being probabilities
C) No error, loss computed correctly
D) RuntimeWarning about invalid values in loss
💡 Hint
Check the from_logits parameter and input values expected.
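For intuition, here is a pure-Python sketch of what the logits-to-probabilities conversion looks like: conceptually, setting from_logits=True squashes raw logits through a sigmoid before the cross-entropy formula is applied (TensorFlow actually uses a fused, numerically stable version internally):

```python
import math

# Raw logits are unbounded real numbers; a sigmoid maps them into (0, 1)
# so the cross-entropy formula receives valid probabilities.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

logits = [0.0, 2.0, -1.0]             # same raw values as the challenge
probs = [sigmoid(z) for z in logits]  # now valid probabilities
print([round(p, 4) for p in probs])   # [0.5, 0.8808, 0.2689]
```

The key observation: logits like 2.0 or -1.0 are not probabilities, so a loss configured for probabilities will not interpret them the way you intend.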
Model Choice
expert
Time limit: 2:30
Selecting loss function for multi-class classification with logits
You have a multi-class classification problem with 5 classes. Your model outputs raw logits (not probabilities). Which loss function and parameter setting is correct to use in TensorFlow?
A) tf.keras.losses.CategoricalCrossentropy(from_logits=False)
B) tf.keras.losses.BinaryCrossentropy(from_logits=True)
C) tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False)
D) tf.keras.losses.CategoricalCrossentropy(from_logits=True)
💡 Hint
Multi-class with logits requires categorical cross-entropy with from_logits=True.
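To see what categorical cross-entropy on logits computes, here is a pure-Python sketch for 5 classes: softmax turns the raw logits into a probability distribution, and the loss is then the negative log-probability assigned to the true class (illustrative values; TensorFlow fuses these steps in a numerically stable way when from_logits=True):

```python
import math

# Softmax: logits -> probability distribution over the 5 classes.
def softmax(z):
    m = max(z)                           # subtract max for numerical stability
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1, -1.0, 0.5]      # raw model outputs, 5 classes
y_true = [1, 0, 0, 0, 0]                 # one-hot: true class is index 0

probs = softmax(logits)
# Categorical cross-entropy: -log(probability of the true class).
loss = -sum(t * math.log(p) for t, p in zip(y_true, probs))
print(round(loss, 4))
```

With integer labels instead of one-hot vectors, the sparse variant computes the same quantity; the from_logits flag only controls whether the softmax step is applied inside the loss.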