
Activation functions (ReLU, Sigmoid, Softmax) in PyTorch - Model Pipeline Trace

Model Pipeline - Activation functions (ReLU, Sigmoid, Softmax)

This pipeline shows how activation functions like ReLU, Sigmoid, and Softmax transform data inside a simple neural network. These functions help the model learn by transforming values in ways suited to different tasks: removing negatives, squashing into a fixed range, or producing probabilities.

Data Flow - 4 Stages
Stage 1: Input Layer (1 row x 4 columns)
Raw input features representing 4 values:
[2.0, -1.0, 0.5, 3.0]

Stage 2: Hidden Layer with ReLU (1 row x 4 columns)
Apply ReLU activation (replaces negatives with zero):
[2.0, 0.0, 0.5, 3.0]

Stage 3: Hidden Layer with Sigmoid (1 row x 4 columns)
Apply Sigmoid activation (squashes values between 0 and 1):
[0.88, 0.5, 0.62, 0.95]

Stage 4: Output Layer with Softmax (1 row x 4 columns)
Apply Softmax activation (converts values to probabilities summing to 1):
[0.28, 0.19, 0.22, 0.30]
Training Trace - Epoch by Epoch

Loss
0.9 |
0.8 | *
0.7 |
0.6 |    *
0.5 |       *
0.4 |          *
0.3 |             *
    +---------------
      1  2  3  4  5   Epochs

Epoch | Loss ↓ | Accuracy ↑ | Observation
------|--------|------------|------------
  1   |  0.85  |   0.40     | Loss starts high, accuracy low as the model begins learning
  2   |  0.65  |   0.60     | Loss decreases, accuracy improves as activations help learning
  3   |  0.50  |   0.75     | Model learns better patterns, activations shape outputs
  4   |  0.40  |   0.82     | Loss continues to drop, accuracy rises steadily
  5   |  0.35  |   0.87     | Training converges with good accuracy and low loss
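An epoch-by-epoch trace like the one above comes from a loop that records the loss after each optimizer step. A minimal sketch with hypothetical toy data (the dataset, layer sizes, and learning rate are illustrative, not from the lesson):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical toy data: 64 samples with 4 features, labeled by their
# largest feature so there is a learnable pattern (4 classes).
X = torch.randn(64, 4)
y = X.argmax(dim=1)

model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),        # hidden activation: zeroes out negatives
    nn.Linear(8, 4),  # raw logits; CrossEntropyLoss applies Softmax internally
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

losses = []
for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
    losses.append(loss.item())
    print(f"Epoch {epoch + 1}: loss = {loss.item():.2f}")
```

Note that the output layer emits raw logits rather than probabilities: `nn.CrossEntropyLoss` combines Softmax and the loss in one numerically stable step.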
Prediction Trace - 4 Layers
Layer 1: Input Layer
Layer 2: ReLU Activation
Layer 3: Sigmoid Activation
Layer 4: Softmax Activation
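The four prediction layers can be sketched as an `nn.Sequential` module. Stacking Sigmoid directly before Softmax mirrors the trace above for illustration; it is not a typical production design:

```python
import torch
import torch.nn as nn

pipeline = nn.Sequential(
    nn.Identity(),       # Layer 1: input passes through unchanged
    nn.ReLU(),           # Layer 2: negatives -> 0
    nn.Sigmoid(),        # Layer 3: squash into (0, 1)
    nn.Softmax(dim=-1),  # Layer 4: normalize into probabilities
)

x = torch.tensor([2.0, -1.0, 0.5, 3.0])
probs = pipeline(x)
print(probs)  # four probabilities that sum to 1
```

In a real network each layer would also have learned weights (`nn.Linear`) between the activations, as in the training sketch earlier.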
Model Quiz - 3 Questions
Test your understanding
What does the ReLU activation do to negative input values?
A. Changes them to one
B. Leaves them unchanged
C. Changes them to zero
D. Converts them to probabilities
Key Insight
Activation functions transform raw numbers into forms that help the model learn and make decisions. ReLU removes negatives, Sigmoid squashes values between 0 and 1, and Softmax turns outputs into probabilities for classification.