TensorFlow · ML · ~20 mins

Precision-recall curves in TensorFlow - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual
intermediate
Understanding Precision-Recall Curve Purpose

What does a precision-recall curve primarily help you understand about a classification model?

A. The speed at which the model makes predictions.
B. The overall accuracy of the model across all classes.
C. The trade-off between the model's ability to identify positive cases and its accuracy on those predictions.
D. The model's performance on negative class predictions only.
💡 Hint

Think about what precision and recall measure in classification.
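To build intuition for this trade-off, here is a small pure-Python sketch (the labels, scores, and the `precision_recall` helper are invented for illustration): sweeping the decision threshold shows that a stricter threshold raises precision but misses positives, while a looser one recovers all positives at the cost of false alarms.

```python
def precision_recall(labels, scores, threshold):
    """Precision and recall when scores >= threshold count as positive."""
    tp = sum(1 for y, s in zip(labels, scores) if s >= threshold and y == 1)
    fp = sum(1 for y, s in zip(labels, scores) if s >= threshold and y == 0)
    fn = sum(1 for y, s in zip(labels, scores) if s < threshold and y == 1)
    precision = tp / (tp + fp) if (tp + fp) else 1.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

labels = [1, 0, 1, 1, 0]
scores = [0.9, 0.1, 0.8, 0.4, 0.3]
print(precision_recall(labels, scores, 0.5))  # strict threshold: (1.0, 0.667) -- precise but misses a positive
print(precision_recall(labels, scores, 0.3))  # loose threshold: (0.75, 1.0) -- full recall, lower precision
```

Each threshold yields one (precision, recall) pair; plotting these pairs over all thresholds traces the precision-recall curve.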

Predict Output
intermediate
Output of TensorFlow Precision-Recall Calculation

What is the shape of the resulting tensor when using TensorFlow's tf.keras.metrics.PrecisionAtRecall with a batch of 5 predictions?

TensorFlow
import tensorflow as tf

labels = tf.constant([1, 0, 1, 1, 0], dtype=tf.int32)
predictions = tf.constant([0.9, 0.1, 0.8, 0.4, 0.3], dtype=tf.float32)

# Best precision achievable while keeping recall >= 0.8
metric = tf.keras.metrics.PrecisionAtRecall(recall=0.8)
metric.update_state(labels, predictions)
result = metric.result()
print(result.shape)
A. () # Scalar tensor representing the precision value
B. (5,) # Tensor with a precision for each prediction
C. (2,) # Tensor with precision and recall values
D. (None,) # Tensor with undefined shape
💡 Hint

Check the documentation for PrecisionAtRecall output.

Model Choice
advanced
Choosing Model Output for Precision-Recall Curve

Which type of model output is most appropriate to generate a precision-recall curve?

A. Raw logits (unnormalized scores) from the last layer
B. Probability scores (values between 0 and 1) for the positive class
C. Binary class labels (0 or 1) predicted by the model
D. One-hot encoded vectors of predicted classes
💡 Hint

Precision-recall curves require a threshold to vary. Which output allows this?
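The hint can be made concrete with a short pure-Python sketch (the `pr_points` helper and its data are invented for illustration): every distinct score serves as a candidate threshold, so continuous probability scores yield many curve points, while hard 0/1 predictions collapse to just two.

```python
def pr_points(labels, scores):
    """One (precision, recall) point per distinct score used as a threshold."""
    points = []
    for t in sorted(set(scores), reverse=True):
        tp = sum(1 for y, s in zip(labels, scores) if s >= t and y == 1)
        fp = sum(1 for y, s in zip(labels, scores) if s >= t and y == 0)
        fn = sum(1 for y, s in zip(labels, scores) if s < t and y == 1)
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        points.append((tp / (tp + fp), recall))
    return points

labels = [1, 0, 1, 0, 1]
probs  = [0.9, 0.2, 0.8, 0.4, 0.3]   # probability scores: five distinct thresholds
hard   = [1, 0, 1, 0, 0]             # hard 0/1 predictions: only two distinct thresholds
print(len(pr_points(labels, probs)))  # 5 curve points
print(len(pr_points(labels, hard)))   # 2 points -- no curve to trace
```

This is why the model output fed into a precision-recall curve should be a continuous score rather than an already-thresholded label.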

Metrics
advanced
Interpreting Area Under Precision-Recall Curve (AUPRC)

What does a higher Area Under the Precision-Recall Curve (AUPRC) indicate about a model's performance?

A. The model has faster training time.
B. The model has higher overall accuracy across all classes.
C. The model predicts negative classes more accurately than positive classes.
D. The model has better balance between precision and recall, especially on imbalanced datasets.
💡 Hint

Think about what precision and recall measure and why AUPRC is useful.
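One common step-wise estimate of the area under the PR curve is average precision. Below is a pure-Python sketch (the `average_precision` helper and its data are invented for illustration, and ties in scores are ignored): it ranks samples by score and accumulates precision weighted by each increase in recall.

```python
def average_precision(labels, scores):
    """AP = sum over ranked samples of (R_n - R_{n-1}) * P_n,
    a step-wise estimate of the area under the PR curve."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    tp = fp = 0
    total_pos = sum(labels)
    ap = prev_recall = 0.0
    for i in order:
        if labels[i] == 1:
            tp += 1
        else:
            fp += 1
        precision = tp / (tp + fp)
        recall = tp / total_pos
        ap += (recall - prev_recall) * precision  # width * height of each step
        prev_recall = recall
    return ap

# Perfect ranking puts every positive above every negative -> AP = 1.0
print(average_precision([1, 0, 1], [0.9, 0.1, 0.8]))  # 1.0
print(average_precision([1, 0, 1], [0.1, 0.9, 0.8]))  # worse ranking -> lower AP
```

Because both precision and recall are defined with respect to the positive class, a high AUPRC stays meaningful even when negatives vastly outnumber positives, which is where plain accuracy becomes misleading.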

🔧 Debug
expert
Debugging Precision-Recall Curve Calculation in TensorFlow

Given the following code snippet, what is the cause of the error when trying to compute a precision-recall curve?

import tensorflow as tf

labels = tf.constant([1, 0, 1, 0, 1])
predictions = tf.constant([0.9, 0.2, 0.8, 0.4, 0.3])

precision, recall, thresholds = tf.metrics.precision_recall_curve(labels, predictions)
print(precision, recall, thresholds)
A. tf.metrics.precision_recall_curve does not exist in TensorFlow; the correct function is in sklearn.metrics.
B. The labels tensor must be float32, not int32.
C. The predictions tensor must be binary labels, not probabilities.
D. The function requires labels and predictions to be lists, not tensors.
💡 Hint

Check TensorFlow API for precision-recall curve functions.
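TensorFlow does not ship a precision_recall_curve function; a common fix (assuming scikit-learn is available) is to compute the curve with sklearn.metrics.precision_recall_curve on the same data:

```python
import numpy as np
from sklearn.metrics import precision_recall_curve

labels = np.array([1, 0, 1, 0, 1])
predictions = np.array([0.9, 0.2, 0.8, 0.4, 0.3])

# scikit-learn returns one more precision/recall entry than thresholds,
# with the curve ending at precision=1, recall=0 by convention
precision, recall, thresholds = precision_recall_curve(labels, predictions)
print(precision, recall, thresholds)
```

If you want to stay within TensorFlow, tf.keras.metrics.AUC(curve='PR') approximates the area under the precision-recall curve, though it does not return the individual curve points.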