ML Python · ~20 mins

Privacy considerations in ML Python - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
Privacy Protector
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual
intermediate
Understanding Differential Privacy
Which of the following best describes the main goal of differential privacy in machine learning?
A) To speed up the training process by reducing data size
B) To prevent the model from revealing whether any single individual's data was used during training
C) To ensure that the model's predictions are always 100% accurate on training data
D) To increase the complexity of the model to avoid overfitting
💡 Hint: Think about protecting individual data privacy when training models.
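For intuition on option B, the core idea can be sketched with the Laplace mechanism: noise calibrated to a query's sensitivity masks any single individual's contribution. This is a minimal illustration, not a library API; the function name and parameters are made up for this sketch.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value plus Laplace noise with scale sensitivity/epsilon."""
    if rng is None:
        rng = np.random.default_rng()
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_value + noise

# A counting query has sensitivity 1: adding or removing one person
# changes the count by at most 1, so that is what the noise must hide.
private_count = laplace_mechanism(true_value=42, sensitivity=1.0, epsilon=0.5)
```

Because the released value is randomized, an observer cannot tell whether any one person's record was in the data, which is exactly the guarantee the question describes.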
Predict Output
intermediate
Output of Privacy-Preserving Noise Addition
What is the output of the following Python code that adds Laplace noise to a data point for privacy?
ML Python
import numpy as np

np.random.seed(0)  # fix the RNG so the noise draw is reproducible
data_point = 10
noise = np.random.laplace(loc=0.0, scale=1.0)  # one Laplace noise sample
private_data = data_point + noise
print(round(private_data, 2))
A) 11.43
B) 11.76
C) 11.54
D) 11.65
💡 Hint: Run the code to see the exact noise value added.
Model Choice
advanced
Choosing a Privacy-Preserving Model Technique
You want to train a machine learning model on sensitive health data while ensuring privacy. Which technique is best suited for this?
A) Use federated learning to train models locally on devices without sharing raw data
B) Train a deep neural network on raw data without modifications
C) Collect all data centrally and anonymize by removing names only
D) Use a simple linear regression without any privacy measures
💡 Hint: Think about methods that keep data on user devices.
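The federated idea in option A can be sketched as a few rounds of local training followed by server-side weight averaging. This is a toy sketch (plain linear-regression gradient steps, synthetic client data), not a production federated-learning framework; only model weights, never raw data, cross the client boundary.

```python
import numpy as np

def local_update(weights, features, labels, lr=0.1):
    """One gradient-descent step on a client's own data (linear regression)."""
    preds = features @ weights
    grad = features.T @ (preds - labels) / len(labels)
    return weights - lr * grad

def federated_average(client_weights):
    """Server aggregates updated weights; raw client data never leaves the device."""
    return np.mean(client_weights, axis=0)

rng = np.random.default_rng(0)
global_w = np.zeros(3)
# Four simulated clients, each with its own private (features, labels) shard.
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]

for _ in range(5):  # communication rounds
    updates = [local_update(global_w.copy(), X, y) for X, y in clients]
    global_w = federated_average(updates)
```

The server only ever sees the averaged weight vectors, which is why this pattern suits sensitive health data better than central collection.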
Metrics
advanced
Evaluating Privacy-Utility Tradeoff
In a privacy-preserving model, which metric combination best reflects a good balance between privacy and model usefulness?
A) High epsilon value in differential privacy and low model accuracy
B) High epsilon value in differential privacy and high model accuracy
C) Low epsilon value in differential privacy and high model accuracy
D) Low epsilon value in differential privacy and low model accuracy
💡 Hint: Lower epsilon means stronger privacy.
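A quick way to see the tradeoff: in the Laplace mechanism the noise scale is sensitivity/epsilon, so shrinking epsilon (stronger privacy) inflates the noise and erodes utility. The values below are illustrative only.

```python
import numpy as np

true_mean = 50.0
sensitivity = 1.0
rng = np.random.default_rng(0)

# Smaller epsilon -> larger noise scale -> stronger privacy, lower utility.
for epsilon in [0.1, 1.0, 10.0]:
    scale = sensitivity / epsilon
    noisy = true_mean + rng.laplace(0.0, scale)
    print(f"epsilon={epsilon}: noise scale={scale:.2f}, released value={noisy:.2f}")
```

Achieving a low epsilon while keeping accuracy high (option C) is exactly the hard part of privacy-preserving ML.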
🔧 Debug
expert
Debugging Privacy Leakage in Model Training
You trained a model with differential privacy, but it still leaks private information. The training code is shown below. Which option identifies the cause?
ML Python
def train_model(data, epsilon):
    # Missing noise addition step
    model = SomeModel()
    model.fit(data)
    return model
A) The model uses federated learning, which always leaks data
B) The model uses too much noise, causing privacy leakage
C) The model trains on encrypted data, which causes leakage
D) The model does not add noise to gradients or parameters, so privacy is not enforced
💡 Hint: Differential privacy requires adding noise during training.
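One way the buggy snippet could be repaired is output perturbation: fit the model, then add calibrated Laplace noise to the released parameters. This is a hedged sketch, not the quiz's actual solution; `SomeModel` is hypothetical, so an ordinary least-squares fit stands in for it.

```python
import numpy as np

def train_model_dp(data, labels, epsilon, sensitivity=1.0, rng=None):
    """Output perturbation sketch: fit, then noise the released parameters.

    `sensitivity` is assumed known for the illustration; in practice it must
    be bounded (e.g. by clipping) before the Laplace scale is valid.
    """
    if rng is None:
        rng = np.random.default_rng()
    weights, *_ = np.linalg.lstsq(data, labels, rcond=None)
    # The step missing from the buggy snippet: noise on what gets released.
    noise = rng.laplace(0.0, sensitivity / epsilon, size=weights.shape)
    return weights + noise
```

The returned weights, not the exact fit, are what leaves the training pipeline, so no individual record can be reconstructed from them.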