
Feature importance explanation in ML Python - ML Experiment: Train & Evaluate

Experiment - Feature importance explanation
Problem: You have trained a decision tree model to predict whether a person will buy a product based on features such as age, income, and browsing time.
Current Metrics: Training accuracy: 90%, Validation accuracy: 85%
Issue: You want to understand which features the model relies on most when making predictions.
Your Task
Explain the importance of each feature used by the decision tree model and visualize it clearly.
Use the trained decision tree model only.
Do not retrain the model or change its parameters.
Solution
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
import matplotlib.pyplot as plt
import numpy as np

# Sample data: Iris dataset (for demonstration)
data = load_iris()
X = data.data
y = data.target
feature_names = data.feature_names

# Train a decision tree model
model = DecisionTreeClassifier(random_state=42)
model.fit(X, y)

# Get feature importances
importances = model.feature_importances_

# Plot feature importances
indices = np.argsort(importances)[::-1]
plt.figure(figsize=(8,5))
plt.title('Feature Importances')
plt.bar(range(len(importances)), importances[indices], color='skyblue')
plt.xticks(range(len(importances)), [feature_names[i] for i in indices], rotation=45, ha='right')
plt.ylabel('Importance')
plt.tight_layout()
plt.show()

# Print feature importance values
for i in indices:
    print(f"Feature: {feature_names[i]}, Importance: {importances[i]:.3f}")
Used the trained decision tree model's feature_importances_ attribute to get importance scores without retraining.
Sorted features by importance and plotted a bar chart for easy visualization.
Printed each feature name with its importance value for a clear explanation.
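One useful sanity check, assuming scikit-learn's impurity-based importances (which are normalized across features): the scores should sum to 1, so each value can be read as that feature's share of the model's total impurity reduction. A minimal sketch:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

data = load_iris()
model = DecisionTreeClassifier(random_state=42).fit(data.data, data.target)

# Impurity-based importances are normalized, so they should sum to 1
total = model.feature_importances_.sum()
print(round(total, 6))
```

This also means the scores are relative: a feature with importance 0.5 accounts for half of the splits' impurity reduction, not half of the accuracy.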
Results Interpretation

Before: The model was a black box; we had no clear idea which features mattered most.

After: We have a clear list and visual chart showing which features the model uses most to decide.

Feature importance helps us understand what drives the model's decisions. It builds trust and guides us to focus on the most useful data.
Bonus Experiment
Try using permutation feature importance to explain feature importance on the same model.
💡 Hint
Use sklearn.inspection.permutation_importance to measure how shuffling each feature affects model accuracy.
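A minimal sketch of the bonus experiment using sklearn.inspection.permutation_importance. The train/validation split here is an assumption for illustration: permutation importance is most meaningful on held-out data, where shuffling a feature shows how much validation accuracy depends on it.

```python
from sklearn.datasets import load_iris
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

data = load_iris()
X_train, X_val, y_train, y_val = train_test_split(
    data.data, data.target, random_state=42)

model = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)

# Shuffle each feature 10 times and measure the average drop in accuracy
result = permutation_importance(model, X_val, y_val,
                                n_repeats=10, random_state=42)

for i in result.importances_mean.argsort()[::-1]:
    print(f"{data.feature_names[i]}: "
          f"{result.importances_mean[i]:.3f} "
          f"+/- {result.importances_std[i]:.3f}")
```

Unlike the impurity-based scores, these values are not normalized to sum to 1; each is the mean accuracy drop caused by shuffling that feature, so a score near 0 means the model barely uses it.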