
Activation functions in ML Python - Model Pipeline Trace

Model Pipeline - Activation functions

This pipeline shows how activation functions help a neural network learn by adding non-linear behavior. Each activation transforms a layer's output so the model can capture complex patterns, not just straight-line relationships.

Data Flow - 5 Stages
1. Input data (1 sample x 3 features → 1 sample x 3 features)
   Raw input features fed into the model.
   [0.5, -1.2, 3.3]
2. Linear transformation (1 sample x 3 features → 1 sample x 2 neurons)
   Multiply inputs by weights and add bias.
   [1.1, -0.7]
3. Activation function (ReLU) (1 sample x 2 neurons → 1 sample x 2 neurons)
   Apply ReLU to keep positive values and zero out negatives.
   [1.1, 0.0]
4. Linear transformation (1 sample x 2 neurons → 1 sample x 2 neurons)
   Multiply by the next layer's weights and add bias.
   [0.3, -0.4]
5. Activation function (Sigmoid) (1 sample x 2 neurons → 1 sample x 2 neurons)
   Apply sigmoid to squash values between 0 and 1.
   [0.574, 0.401]
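The two activation steps above can be reproduced with a few lines of NumPy. The intermediate vectors [1.1, -0.7] and [0.3, -0.4] come directly from the trace; the weights that produced them are not shown, so only the activations are sketched here.

```python
import numpy as np

def relu(z):
    # Keep positive values, zero out negatives
    return np.maximum(0.0, z)

def sigmoid(z):
    # Squash each value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Stage 3: ReLU applied to the first linear output
z1 = np.array([1.1, -0.7])
a1 = relu(z1)       # [1.1, 0.0]

# Stage 5: sigmoid applied to the second linear output
z2 = np.array([0.3, -0.4])
a2 = sigmoid(z2)    # ≈ [0.574, 0.401]
```

Note how ReLU leaves 1.1 untouched but replaces -0.7 with 0.0, while sigmoid maps both values into the (0, 1) range.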
Training Trace - Epoch by Epoch
Loss
0.7 |****
0.6 |*** 
0.5 |**  
0.4 |*   
0.3 |*   
    +-----
    1 2 3 4 5 Epochs
Epoch | Loss ↓ | Accuracy ↑ | Observation
1     | 0.65   | 0.60       | Loss starts high, accuracy is low as the model begins learning
2     | 0.50   | 0.72       | Loss decreases, accuracy improves as activations help the model learn
3     | 0.40   | 0.80       | Model learns better patterns; activation functions enable non-linear learning
4     | 0.35   | 0.85       | Loss continues to drop, accuracy rises steadily
5     | 0.30   | 0.88       | Training converges; activation functions help the model capture complex data
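A minimal sketch of the kind of loop that produces a trace like this, using plain NumPy logistic regression on toy data. The dataset, learning rate, and model here are illustrative, not the ones behind the numbers above; the point is only that the loss falls epoch by epoch.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))                 # 64 toy samples, 3 features
y = (X[:, 0] + X[:, 1] > 0).astype(float)    # toy binary labels

W = rng.normal(scale=0.1, size=3)            # weights
b = 0.0                                      # bias
lr = 0.5                                     # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for epoch in range(5):
    p = sigmoid(X @ W + b)                   # forward pass
    # Binary cross-entropy loss (small epsilon avoids log(0))
    loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    losses.append(loss)
    # Gradient of the loss w.r.t. the linear output, then update
    grad = p - y
    W -= lr * X.T @ grad / len(y)
    b -= lr * grad.mean()
```

Printing `losses` shows the same qualitative shape as the table: highest at epoch 1, steadily shrinking toward convergence.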
Prediction Trace - 5 Layers
Layer 1: Input layer
Layer 2: First linear layer
Layer 3: ReLU activation
Layer 4: Second linear layer
Layer 5: Sigmoid activation
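The five layers above can be written as a single forward pass. The weights and biases (`W1`, `b1`, `W2`, `b2`) are hypothetical placeholders, since the trace does not include the trained parameters; only the shapes (3 → 2 → 2) match the pipeline.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.2, 3.3])     # Layer 1: input (1 sample x 3 features)

# Hypothetical parameters, shapes matching the trace: 3 -> 2 -> 2
W1 = np.array([[0.2, -0.1],
               [0.4,  0.3],
               [0.1, -0.2]])
b1 = np.array([0.05, -0.05])
W2 = np.array([[0.5, -0.3],
               [0.2,  0.6]])
b2 = np.array([0.0, 0.1])

h = x @ W1 + b1        # Layer 2: first linear layer
a = relu(h)            # Layer 3: ReLU activation (non-negative)
z = a @ W2 + b2        # Layer 4: second linear layer
out = sigmoid(z)       # Layer 5: sigmoid, both outputs in (0, 1)
```

Each linear layer changes the representation; each activation bends it, which is what lets the stack model non-linear patterns.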
Model Quiz - 3 Questions
Test your understanding
What does the ReLU activation function do to negative input values?
A. Changes them to zero
B. Keeps them the same
C. Changes them to one
D. Inverts their sign
Key Insight
Activation functions add important non-linear transformations that let neural networks learn complex patterns beyond simple straight lines. Without them, models would be limited to only simple relationships.
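One quick way to see why the non-linearity matters: two stacked linear layers with no activation between them collapse into a single linear layer, so the depth adds no expressive power. A NumPy check with random, purely illustrative weights:

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.normal(size=(3, 2))
W2 = rng.normal(size=(2, 2))
x = rng.normal(size=3)

# Two linear layers applied in sequence...
two_layers = (x @ W1) @ W2
# ...are identical to one linear layer with the merged weight matrix
one_layer = x @ (W1 @ W2)

same = np.allclose(two_layers, one_layer)  # no activation -> no extra power
```

Inserting a ReLU between the two layers breaks this equivalence, which is exactly what the pipeline above relies on.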