Complete the code to import the library used for differential privacy in machine learning.
from diffprivlib import [1]
The mechanisms module provides tools to add differential privacy to machine learning models.
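As a self-contained sketch (deliberately not importing diffprivlib itself), the kind of Laplace mechanism that diffprivlib's mechanisms module provides can be illustrated in plain Python; the function name `laplace_mechanism` and the parameter values here are illustrative assumptions, not diffprivlib's API:

```python
import math
import random

def laplace_mechanism(value, epsilon, sensitivity):
    """Add Laplace noise scaled to sensitivity/epsilon (the classic DP mechanism)."""
    scale = sensitivity / epsilon
    # Inverse-CDF sampling of a Laplace(0, scale) variate.
    u = random.random() - 0.5
    return value - scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

random.seed(42)  # fixed seed so the demo is reproducible
noisy = laplace_mechanism(10.0, epsilon=1.0, sensitivity=1.0)
```

With diffprivlib installed, the equivalent would come ready-made from its mechanisms module rather than being hand-rolled as above.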
Complete the code to add Gaussian noise for privacy to a numeric value.
noisy_value = original_value + [1](0, 1)
random.gauss (from Python's built-in random module) adds Gaussian noise with mean 0 and standard deviation 1, which is useful for privacy.
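A completed, runnable version of this card might look like the following; the starting value 42.0 and the seed are illustrative:

```python
import random

random.seed(0)  # fixed seed so the demo is reproducible
original_value = 42.0
# Gaussian noise with mean 0 and standard deviation 1
noisy_value = original_value + random.gauss(0, 1)
```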
Fix the error in the code to ensure the model does not memorize sensitive data by limiting training epochs.
model.fit(X_train, y_train, epochs=[1])
Using fewer epochs, such as 1, helps prevent overfitting and memorizing sensitive data.
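The `model.fit(..., epochs=...)` call in the card is the Keras API; as a hedged stand-in that runs without any ML framework, a toy gradient-descent `fit` with an epochs cap shows why limiting passes over the data limits how closely a model can fit (and hence memorize) it:

```python
def fit(xs, ys, epochs=1, lr=0.1):
    """Fit y = w*x by gradient descent, making `epochs` passes over the data."""
    w = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            w += lr * (y - w * x) * x  # one SGD step per sample
    return w

xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]  # true relation: y = 2x
w_one = fit(xs, ys, epochs=1)      # a single pass leaves a rough fit
w_many = fit(xs, ys, epochs=100)   # many passes fit the data near-exactly
```

After one epoch the weight is still noticeably off from 2.0; after 100 epochs it has effectively locked onto the training data, which is the memorization risk the card warns about.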
Fill in the blank to create a dictionary comprehension that filters out data points with sensitive attribute value 'Yes'.
filtered_data = {k: v for k, v in data.items() if v [1] 'Yes'}
We want to keep data where the value is not equal to 'Yes', so use != 'Yes'.
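Filled in with `!=`, the comprehension runs as below; the shape of `data` (keys mapped directly to the sensitive attribute value) is an assumption for the demo:

```python
# Assumed shape: each key maps to its sensitive attribute value.
data = {"alice": "No", "bob": "Yes", "cara": "No"}

# Keep only entries whose sensitive attribute is not 'Yes'.
filtered_data = {k: v for k, v in data.items() if v != 'Yes'}
# filtered_data == {"alice": "No", "cara": "No"}
```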
Fill all three blanks to create a dictionary comprehension that maps user IDs to their masked emails only if their consent is True.
masked_emails = {user[1]: email[2] for user, (email, consent) in users.items() if consent [3] True}
The user ID is masked by showing its first 3 characters plus stars, emails are converted to lowercase, and we filter only users with consent True.
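With the three blanks filled (a slice-plus-stars mask, `.lower()`, and `==`), the comprehension runs as below; the `users` dictionary shape `{user_id: (email, consent)}` follows the card, while the sample values are illustrative:

```python
users = {
    "alice123": ("Alice@Example.com", True),
    "bob45678": ("Bob@Example.com", False),
}

masked_emails = {
    user[:3] + "***": email.lower()        # blank [1]: mask; blank [2]: lowercase
    for user, (email, consent) in users.items()
    if consent == True                     # blank [3]: the == comparison
}
# masked_emails == {"ali***": "alice@example.com"}
```

Note that `if consent` alone would be the more idiomatic Python; `== True` is shown only because the exercise's blank calls for a comparison operator.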