Environmental Cost of Training AI Models (AI for Everyone: Time & Space Complexity)
We want to understand how the environmental cost grows as AI models get bigger and training takes longer.
How does the energy and resource use increase when training larger AI models?
Analyze the time complexity of the following AI training process.
```python
for epoch in range(num_epochs):     # outer loop: one full pass over the data
    for batch in training_data:     # inner loop: one batch at a time
        model.train(batch)          # one training step per batch
```
This code trains an AI model by iterating over all data batches multiple times; each full pass over the data is called an epoch.
Look at what repeats in the training process.
- Primary operation: Training on each batch of data.
- How many times: Number of epochs times number of batches.
As the dataset or epochs grow, the training steps increase proportionally.
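The counting argument above can be checked directly: a minimal sketch (with hypothetical batch and epoch counts) that tallies one step per `model.train(batch)` call in the nested loop.

```python
def total_training_steps(num_batches: int, num_epochs: int) -> int:
    """Count training steps performed by the nested training loop."""
    steps = 0
    for _ in range(num_epochs):        # outer loop: epochs
        for _ in range(num_batches):   # inner loop: batches
            steps += 1                 # one model.train(batch) call
    return steps

print(total_training_steps(100, 5))  # 500
```

The loop makes the multiplication explicit: the inner body runs exactly epochs × batches times, which is why the step count grows with both factors.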
| Input Size (batches, epochs) | Total Training Steps |
|---|---|
| 10 batches, 5 epochs | 50 training steps |
| 100 batches, 5 epochs | 500 training steps |
| 1000 batches, 10 epochs | 10,000 training steps |
Pattern observation: The total training steps grow directly with both data size and epochs.
Time Complexity: O(n * e), where n is the number of batches and e is the number of epochs.
This means the training time grows in direct proportion to both the number of data batches and the number of epochs.
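This direct proportionality can be sketched as a quick check (hypothetical sizes): doubling either factor doubles the total step count.

```python
def steps(batches: int, epochs: int) -> int:
    """Total training steps for the nested loop: one per (epoch, batch) pair."""
    return batches * epochs

base = steps(100, 5)
assert steps(200, 5) == 2 * base   # double the data  -> double the steps
assert steps(100, 10) == 2 * base  # double the epochs -> double the steps
```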
[X] Wrong: "Training time stays the same no matter how much data or epochs we use."
[OK] Correct: More data or more epochs mean more training steps, so energy and time increase accordingly.
Understanding how training time scales helps you explain AI model costs clearly and shows you grasp real-world AI challenges.
"What if we reduce the number of epochs but increase the batch size? How would the time complexity change?"