# Regulatory compliance (GDPR, AI Act) in MLOps: Time & Space Complexity
When managing machine learning workflows, complying with regulations such as the GDPR and the EU AI Act is essential. Here we examine how the time needed to run compliance checks grows as the amount of data or the number of models grows.
Analyze the time complexity of the following code snippet.
```python
# Check every data record for consent ...
for record in dataset:
    if not check_data_consent(record):
        remove_record(record)
    else:
        log_compliance(record)

# ... and every deployed model for fairness.
for model in deployed_models:
    if not validate_model_fairness(model):
        alert_team(model)
```
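To make the snippet executable, here is a minimal runnable sketch. The stub implementations and the data shapes (`consent` flags, `fairness_score` thresholds) are assumptions for illustration, not real compliance logic:

```python
# Hypothetical data: each record carries a consent flag,
# each model carries a precomputed fairness score.
dataset = [
    {"id": 1, "consent": True},
    {"id": 2, "consent": False},
    {"id": 3, "consent": True},
]
deployed_models = [
    {"name": "churn-v2", "fairness_score": 0.92},
    {"name": "credit-v1", "fairness_score": 0.71},
]

def check_data_consent(record):
    # Stand-in for a real consent lookup (e.g. a consent registry query).
    return record["consent"]

def validate_model_fairness(model, threshold=0.8):
    # Stand-in for a real fairness audit; threshold is an assumption.
    return model["fairness_score"] >= threshold

# One pass over records (O(n)) and one pass over models (O(m)).
compliant = [r for r in dataset if check_data_consent(r)]
flagged = [m for m in deployed_models if not validate_model_fairness(m)]

print(len(compliant), [m["name"] for m in flagged])  # 2 ['credit-v1']
```

Each item is visited exactly once, which is what makes the overall cost linear in the input size.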
This code checks each data record for consent and each model for fairness compliance.
Identify the repeated operations: loops, recursion, or array traversals.
- Primary operation: Looping through each data record and each deployed model.
- How many times: Once per record in the dataset and once per model in deployed_models.
As the number of data records or models grows, the time to check compliance grows proportionally.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 records/models | About 20 checks |
| 100 records/models | About 200 checks |
| 1000 records/models | About 2000 checks |
Pattern observation: Doubling the data or models roughly doubles the work needed.
Time Complexity: O(n)

More precisely, with n records and m models the work is O(n + m), which is linear in the total input: the time to ensure compliance grows directly with the number of data records and models.
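The operation counts in the table above can be reproduced with a small counter sketch (assuming, as the table does, an equal number of records and models):

```python
def compliance_ops(n_records, n_models):
    """Count the checks performed by the loops above: one per item."""
    ops = 0
    for _ in range(n_records):
        ops += 1  # one consent check per record
    for _ in range(n_models):
        ops += 1  # one fairness check per model
    return ops

for n in (10, 100, 1000):
    print(f"{n} records/models -> {compliance_ops(n, n)} checks")
# 10 records/models -> 20 checks
# 100 records/models -> 200 checks
# 1000 records/models -> 2000 checks
```

Doubling `n` doubles the count, matching the linear pattern in the table.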
[X] Wrong: "Compliance checks happen instantly regardless of data size."
[OK] Correct: Each record and model must be checked, so more data means more time needed.
Understanding how compliance checks scale helps you design systems that stay efficient as data grows.
"What if compliance checks were done only on new or changed data instead of all data? How would the time complexity change?"
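One possible answer, sketched under the assumption that we track which record ids have already passed validation (the `id`/`consent` fields are hypothetical): re-checking only new or changed records costs O(k), where k is the size of the delta, rather than O(n) over the full dataset on every run.

```python
def incremental_consent_check(dataset, validated_ids):
    """Re-check only records not yet validated: O(k) work,
    where k is the number of new or changed records."""
    checks = 0
    for record in dataset:
        if record["id"] in validated_ids:
            continue  # average O(1) set lookup, no compliance check needed
        checks += 1
        if record["consent"]:
            validated_ids.add(record["id"])
    return checks

data = [{"id": i, "consent": True} for i in range(1000)]
seen = set()
first = incremental_consent_check(data, seen)   # full pass: 1000 checks
data.append({"id": 1000, "consent": True})      # one new record arrives
second = incremental_consent_check(data, seen)  # only the new record: 1 check
print(first, second)  # 1000 1
```

The loop still scans all ids, so this sketch is O(n) in iterations but O(k) in expensive compliance checks; pairing it with an event stream of changed records would make the whole run O(k).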