Which factor most directly increases energy consumption when training large AI models?
Think about what makes the computer work harder during training.
More training data means the model processes more examples, which increases total computation and therefore energy use.
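The scaling above can be sketched with the common ~6·N·D rule of thumb for transformer training FLOPs (N = parameters, D = tokens processed); the model size and token counts below are made up for illustration, not measured figures.

```python
# Rough sketch: training compute (and thus energy) grows with data volume.
# Uses the common ~6 * N * D approximation for transformer training FLOPs.
# All numbers are hypothetical.

def training_flops(params: float, tokens: float) -> float:
    """Approximate total training FLOPs for a transformer-style model."""
    return 6 * params * tokens

params = 1e9          # hypothetical 1-billion-parameter model
small_data = 10e9     # 10B training tokens
large_data = 100e9    # 100B training tokens

print(training_flops(params, small_data))  # 6e+19
print(training_flops(params, large_data))  # 6e+20 -> 10x the compute
```

With the same model, ten times the data means roughly ten times the compute, and energy tracks compute.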
What is a common environmental impact caused by training large AI models?
Consider what happens when computers use a lot of electricity generated from fossil fuels.
Training AI models requires a lot of electricity, often produced by burning fossil fuels, which releases carbon dioxide.
Which approach best helps reduce the environmental cost of training AI models?
Think about ways to do the same work but use less power.
More efficient hardware and better algorithms reduce the energy needed for the same training work, lowering environmental impact.
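A minimal sketch of the efficiency effect: holding the training workload fixed, more operations per joule means less total energy. The efficiency figures below are hypothetical, not specs of any real hardware.

```python
# Sketch: the same training workload costs less energy on more efficient hardware.
# Efficiency values are illustrative assumptions.

def energy_joules(total_flops: float, flops_per_joule: float) -> float:
    """Energy needed to run a fixed workload at a given hardware efficiency."""
    return total_flops / flops_per_joule

workload = 1e21  # fixed training workload in FLOPs

older_gpu = energy_joules(workload, flops_per_joule=1e10)
newer_gpu = energy_joules(workload, flops_per_joule=4e10)

print(older_gpu / newer_gpu)  # 4.0 -> 4x less energy on the more efficient hardware
```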
Given two AI models, Model X trained on a small dataset for many epochs, and Model Y trained on a large dataset for fewer epochs, which likely has a higher environmental cost?
Consider how data size and number of epochs affect total computation.
Total computation scales with dataset size times the number of epochs, so a much larger dataset, even at fewer epochs, usually requires more total computation and energy than many passes over a small dataset.
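The comparison can be made concrete with a quick calculation; the dataset sizes and epoch counts below are hypothetical numbers chosen only to illustrate the dataset-size × epochs product.

```python
# Illustrative comparison: total work scales with dataset size * epochs.
# All numbers are hypothetical.

def total_examples_processed(dataset_size: int, epochs: int) -> int:
    """Total training examples seen = dataset size * number of passes."""
    return dataset_size * epochs

# Model X: small dataset, many epochs
model_x = total_examples_processed(dataset_size=100_000, epochs=50)
# Model Y: large dataset, few epochs
model_y = total_examples_processed(dataset_size=50_000_000, epochs=2)

print(model_x)  # 5000000
print(model_y)  # 100000000 -> Model Y does 20x the work
```

Even with only 2 epochs, Model Y processes far more examples in total, so its compute (and likely its energy cost) is higher.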
Why might a company choose to train a larger AI model despite its higher environmental cost?
Think about the benefits companies seek from AI models.
Larger models often deliver better results, which can justify their higher energy use despite environmental concerns.