
Elastic Net regularization in ML Python

Introduction
Elastic Net helps a model avoid overfitting by combining two kinds of penalty: L1 (Lasso), which removes unimportant features, and L2 (Ridge), which shrinks coefficients. It is a good fit:
When you have many features and want to select the most important ones.
When features are correlated and you want to keep groups of related features together.
When you want a balance between removing unimportant features and shrinking coefficients.
When pure Lasso or Ridge regularization alone does not give good results.
When you want to improve model prediction on new data by controlling complexity.
Syntax
ML Python
ElasticNet(alpha=1.0, l1_ratio=0.5, fit_intercept=True, max_iter=1000, random_state=None)
alpha controls the overall strength of regularization; higher values mean a stronger penalty.
l1_ratio controls the mix between the L1 (Lasso) and L2 (Ridge) penalties; 0 = pure Ridge, 1 = pure Lasso.
Examples
More weight on L1 penalty to encourage feature selection.
ML Python
ElasticNet(alpha=0.5, l1_ratio=0.7)
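A quick way to see the feature-selection effect of these settings is to fit them on data where only a few features matter; the specific dataset sizes below are just illustrative:

```python
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.datasets import make_regression

# Toy data: 10 features, but only 3 actually influence the target
X, y = make_regression(n_samples=500, n_features=10, n_informative=3,
                       noise=5, random_state=0)

# A mostly-L1 penalty pushes coefficients of unhelpful features to exactly zero
model = ElasticNet(alpha=0.5, l1_ratio=0.7, random_state=0).fit(X, y)

print(model.coef_)
print(f"Features zeroed out: {(model.coef_ == 0).sum()}")
```

The L1 component produces exact zeros, so you can read feature selection directly off the coefficient vector.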
Equivalent to Ridge regression (L2 penalty only). Note that for this case scikit-learn recommends using Ridge directly, since its coordinate descent solver is not reliable for l1_ratio values at or near 0.
ML Python
ElasticNet(alpha=1.0, l1_ratio=0.0)
Equivalent to Lasso regression (L1 penalty only).
ML Python
ElasticNet(alpha=1.0, l1_ratio=1.0)
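Since Lasso is implemented in scikit-learn as this special case of ElasticNet, the l1_ratio=1.0 equivalence can be checked directly; a small sketch:

```python
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=100, n_features=5, noise=10, random_state=0)

# Same alpha, pure L1 mix vs. a plain Lasso
enet = ElasticNet(alpha=0.5, l1_ratio=1.0, random_state=0).fit(X, y)
lasso = Lasso(alpha=0.5, random_state=0).fit(X, y)

# The fitted coefficients agree up to solver tolerance
print(np.allclose(enet.coef_, lasso.coef_, atol=1e-4))
```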
Sample Model
This example creates a dataset, trains an Elastic Net model, and shows how well it predicts new data by printing the error and learned coefficients.
ML Python
from sklearn.linear_model import ElasticNet
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Create sample data with 100 samples and 10 features
X, y = make_regression(n_samples=100, n_features=10, noise=10, random_state=42)

# Split data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Create ElasticNet model with alpha=0.1 and l1_ratio=0.5
model = ElasticNet(alpha=0.1, l1_ratio=0.5, random_state=42)

# Train the model
model.fit(X_train, y_train)

# Predict on test data
predictions = model.predict(X_test)

# Calculate mean squared error
mse = mean_squared_error(y_test, predictions)

print(f"Mean Squared Error: {mse:.2f}")
print(f"Model coefficients: {model.coef_}")
Important Notes
Elastic Net is useful when you want both feature selection (like Lasso) and coefficient shrinkage (like Ridge).
Choosing the right alpha and l1_ratio values usually requires trying several options and checking performance.
Elastic Net handles highly correlated features better than Lasso alone, which tends to arbitrarily pick one feature from a correlated group and drop the rest.
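Rather than trying alpha and l1_ratio values by hand, scikit-learn's ElasticNetCV can search over them with cross-validation; a minimal sketch (the candidate l1_ratio grid here is just an example):

```python
from sklearn.linear_model import ElasticNetCV
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=200, n_features=10, noise=10, random_state=42)

# Try several L1/L2 mixes; candidate alphas are generated automatically
cv_model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.7, 0.9, 1.0], cv=5,
                        random_state=42).fit(X, y)

print(f"Best alpha: {cv_model.alpha_:.4f}")
print(f"Best l1_ratio: {cv_model.l1_ratio_}")
```

The chosen values are stored on the fitted estimator as alpha_ and l1_ratio_, and the model is ready to predict with them.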
Summary
Elastic Net combines L1 and L2 penalties to keep models simple and stable.
It helps select important features and reduce overfitting.
Adjust alpha and l1_ratio to control the balance between penalties.