Prompt Engineering / GenAI (~20 mins)

Environmental impact of AI in Prompt Engineering / GenAI - ML Experiment: Train & Evaluate

Experiment - Environmental impact of AI
Problem: AI models require substantial computing power, which consumes energy and contributes to carbon emissions. We want to understand how to reduce this impact while maintaining good model performance.
Current Metrics: Training energy consumption: 100 kWh; Model accuracy: 92%; Estimated carbon footprint: 50 kg CO2
Issue: Training the model consumes too much energy and produces too large a carbon footprint to be sustainable.
Your Task
Reduce the energy consumption and carbon footprint of training the AI model by at least 30% while keeping accuracy above 90%.
Do not reduce the model accuracy below 90%.
Keep the same dataset and model architecture.
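Before optimizing, it helps to see how metrics like the baseline's 100 kWh and 50 kg CO2 can be estimated from hardware power draw and training time. The sketch below uses made-up figures (a 250 W average GPU draw, 400 hours of training, and a grid intensity of 0.5 kg CO2 per kWh) chosen purely to illustrate the arithmetic; real values depend on your hardware and electricity grid:

```python
# Back-of-the-envelope energy and carbon estimate for a training run.
# All figures below are illustrative assumptions, not measurements.
gpu_power_watts = 250      # assumed average power draw during training
training_hours = 400       # assumed total training time
grid_intensity = 0.5       # assumed kg CO2 emitted per kWh of electricity

# Energy (kWh) = power (W) x time (h) / 1000
energy_kwh = gpu_power_watts * training_hours / 1000

# Carbon (kg CO2) = energy (kWh) x grid intensity (kg CO2 / kWh)
carbon_kg = energy_kwh * grid_intensity

print(f"Energy: {energy_kwh} kWh")   # 100.0 kWh
print(f"Carbon: {carbon_kg} kg CO2") # 50.0 kg CO2
```

Anything that shortens training time (fewer epochs, early stopping) or lowers power draw (mixed precision, more efficient hardware) reduces both numbers proportionally.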
Solution
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.callbacks import EarlyStopping

# Load dataset (example with dummy data)
X_train = np.random.rand(1000, 20)
y_train = np.random.randint(2, size=1000)

# Define model
model = Sequential([
    Dense(64, activation='relu', input_shape=(20,)),
    Dense(1, activation='sigmoid')
])

model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# Use early stopping to reduce training time and energy
early_stop = EarlyStopping(monitor='val_loss', patience=3, restore_best_weights=True)

# Train model with validation split and early stopping
history = model.fit(
    X_train, y_train,
    epochs=30,  # Reduced from 50
    batch_size=32,
    validation_split=0.2,
    callbacks=[early_stop],
    verbose=0
)

# Evaluate model (on the training set here for simplicity;
# in practice, evaluate on a held-out test set)
loss, accuracy = model.evaluate(X_train, y_train, verbose=0)

# Simulated energy and carbon footprint after the changes
# (illustrative values for this exercise, not measured)
energy_consumption_kwh = 65  # Reduced from 100 kWh
carbon_footprint_kg = 35     # Reduced from 50 kg CO2

print(f"Final accuracy: {accuracy*100:.2f}%")
print(f"Energy consumption: {energy_consumption_kwh} kWh")
print(f"Carbon footprint: {carbon_footprint_kg} kg CO2")
Reduced the number of training epochs from 50 to 30.
Added early stopping to stop training when validation loss stops improving.
Kept batch size at 32 to balance training speed and energy use.
These changes reduce training time and energy consumption without hurting accuracy.
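The early-stopping mechanism used above can be sketched without any framework. The validation-loss sequence below is invented to show the behavior: once the loss stops improving for `patience` consecutive epochs, training halts, so the remaining epochs (and their energy) are never spent:

```python
# Manual early-stopping loop (illustrative; mirrors what
# Keras EarlyStopping with patience=3 does internally).
val_losses = [0.90, 0.70, 0.60, 0.61, 0.62, 0.63, 0.64, 0.65]
patience = 3

best = float('inf')  # best validation loss seen so far
wait = 0             # epochs since the last improvement
epochs_run = 0

for loss in val_losses:
    epochs_run += 1
    if loss < best:
        best, wait = loss, 0     # improvement: reset the counter
    else:
        wait += 1                # no improvement this epoch
        if wait >= patience:
            break                # stop early

print(f"Stopped after {epochs_run} epochs, best val_loss {best}")
# Stopped after 6 epochs, best val_loss 0.6
```

Here training stops after 6 epochs instead of running to the maximum, which is exactly how energy is saved without hurting the best model (Keras's `restore_best_weights=True` additionally rolls back to the weights from the best epoch).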
Results Interpretation

Before: Accuracy: 92%, Energy: 100 kWh, Carbon footprint: 50 kg CO2

After: Accuracy: 91.5%, Energy: 65 kWh, Carbon footprint: 35 kg CO2

By training smarter with early stopping and fewer epochs, we can save energy and reduce environmental impact while keeping the model accurate.
Bonus Experiment
Try using mixed precision training to further reduce energy use and carbon footprint without losing accuracy.
💡 Hint
Mixed precision uses lower precision numbers to speed up training and save energy. TensorFlow supports this with a simple policy setting.
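A minimal sketch of enabling mixed precision in TensorFlow, following the standard Keras mixed-precision approach: setting the global policy makes layers compute in float16 while keeping their variables in float32, and keeping the final layer in float32 is the usual numerical-stability precaution. (The speed and energy savings mainly materialize on GPUs with float16 support.)

```python
import tensorflow as tf
from tensorflow.keras import mixed_precision

# Compute in float16, keep trainable variables in float32
mixed_precision.set_global_policy('mixed_float16')

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu', input_shape=(20,)),
    # Keep the output layer in float32 so the final activations are stable
    tf.keras.layers.Dense(1, activation='sigmoid', dtype='float32'),
])

print(model.layers[0].compute_dtype)   # float16
print(model.layers[0].variable_dtype)  # float32
```

The rest of the training script (optimizer, `fit`, early stopping) stays unchanged; only the policy line and the output-layer dtype are added.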