TensorFlow ML · ~12 mins

Validation split in TensorFlow - Model Pipeline Trace

Model Pipeline - Validation split

This pipeline shows how a dataset is split into training and validation sets, so the model can learn from one portion of the data while its performance is checked on unseen data throughout training.
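In Keras, this entire pipeline can be driven by a single argument to `model.fit`. A minimal sketch, assuming a toy 1000 x 10 dataset with random binary labels (the layer sizes here are hypothetical, not from the original pipeline):

```python
import numpy as np
import tensorflow as tf

# Toy dataset: 1000 rows x 10 feature columns, binary labels
X = np.random.rand(1000, 10).astype("float32")
y = np.random.randint(0, 2, size=(1000,))

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# validation_split=0.2 holds out the LAST 20% of rows (200 of 1000);
# the model trains on 800 rows and is evaluated on the held-out 200
# at the end of every epoch.
history = model.fit(X, y, epochs=5, validation_split=0.2, verbose=0)
```

Note that `validation_split` takes the tail of the arrays without shuffling first, so the data should be in random order before calling `fit`.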

Data Flow - 4 Stages
Stage 1: Original dataset
  Input: 1000 rows x 10 columns (features and labels)
  Example: [[5.1, 3.5, ..., 0], [4.9, 3.0, ..., 1], ...]

Stage 2: Train/Validation split
  Input: 1000 rows x 10 columns
  Action: split the dataset into 80% training and 20% validation
  Output: Training 800 rows x 10 columns; Validation 200 rows x 10 columns
  Example: training sample [5.1, 3.5, ..., 0]; validation sample [6.7, 3.1, ..., 1]

Stage 3: Model training
  Input: Training set (800 rows x 10 columns)
  Action: train the model on the training data
  Output: a trained model that has learned patterns from the training samples

Stage 4: Validation during training
  Input: Validation set (200 rows x 10 columns)
  Action: evaluate the model on the validation data at the end of each epoch
  Output: validation loss and accuracy metrics (validation accuracy improves from 60% to 85%)
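Stage 2 can be done by hand before any model code is involved. A minimal sketch of an 80/20 shuffle-and-split in NumPy (the function name and seed are illustrative):

```python
import numpy as np

def train_val_split(X, y, val_fraction=0.2, seed=0):
    """Shuffle row indices, then hold out val_fraction of rows for validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_val = int(len(X) * val_fraction)
    val_idx, train_idx = idx[:n_val], idx[n_val:]
    return X[train_idx], y[train_idx], X[val_idx], y[val_idx]

# 1000 rows x 10 columns, as in Stage 1
X = np.random.rand(1000, 10)
y = np.random.randint(0, 3, size=1000)

X_train, y_train, X_val, y_val = train_val_split(X, y)
# X_train has 800 rows, X_val has 200 rows
```

Shuffling before splitting matters: if the dataset is sorted by label, a tail split would put entire classes into validation only.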
Training Trace - Epoch by Epoch
Loss
0.7 |****
0.6 |*** 
0.5 |**  
0.4 |*   
0.3 |*   
     1 2 3 4 5 Epochs
Epoch | Loss ↓ | Accuracy ↑ | Observation
------|--------|------------|---------------------------------------------------
  1   |  0.65  |   0.60     | Model starts learning; loss is high, accuracy low
  2   |  0.50  |   0.72     | Loss decreases, accuracy improves
  3   |  0.40  |   0.80     | Model learns better features
  4   |  0.35  |   0.83     | Training loss decreases steadily
  5   |  0.30  |   0.85     | Model converges with good accuracy
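The epoch-by-epoch pattern above can be reproduced with a bare-bones training loop that touches the training rows for updates and the validation rows only for measurement. A sketch using logistic regression on synthetic data (model, learning rate, and data are all illustrative stand-ins, not the original pipeline's network):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
true_w = rng.normal(size=10)
y = (X @ true_w > 0).astype(float)   # synthetic binary labels

# 80/20 split, as in the pipeline
X_train, y_train = X[:800], y[:800]
X_val, y_val = X[800:], y[800:]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(10)
val_acc = []
for epoch in range(5):
    # Gradient step on TRAINING data only
    p = sigmoid(X_train @ w)
    w -= 0.5 * X_train.T @ (p - y_train) / len(y_train)
    # Evaluate on held-out VALIDATION data once per epoch
    acc = ((sigmoid(X_val @ w) > 0.5) == y_val).mean()
    val_acc.append(acc)
```

The key structural point is the separation: `w` is never updated using `X_val`, so `val_acc` tracks performance on data the model has not fit to.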
Prediction Trace - 4 Layers
Layer 1: Input layer
Layer 2: Hidden layer with ReLU
Layer 3: Output layer with Softmax
Layer 4: Prediction
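The four layers of the prediction trace can be sketched as a single NumPy forward pass (the layer widths of 10 inputs, 16 hidden units, and 3 classes are assumed for illustration; the weights are random):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))  # shift for stability
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical weights: 10 inputs -> 16 hidden units -> 3 classes
W1, b1 = rng.normal(size=(10, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 3)), np.zeros(3)

x = rng.normal(size=(1, 10))         # Layer 1: input
h = relu(x @ W1 + b1)                # Layer 2: hidden layer with ReLU
p = softmax(h @ W2 + b2)             # Layer 3: output layer with softmax
pred = int(p.argmax(axis=-1)[0])     # Layer 4: predicted class index
```

Softmax turns the output layer's raw scores into probabilities that sum to 1, and the prediction is simply the index of the largest probability.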
Model Quiz - 3 Questions
Test your understanding
Why do we use a validation split during training?
A. To speed up training by using less data
B. To increase training data size
C. To check model performance on unseen data during training
D. To test the model after training finishes
Key Insight
Using a validation split shows how well the model generalizes to new data while it is still training. Because validation metrics are computed on data the model never trains on, they give honest feedback each epoch: steadily improving validation accuracy and decreasing validation loss indicate healthy learning, while a validation loss that starts rising again is a classic sign of overfitting.
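This feedback loop is what makes validation-based early stopping possible. A minimal sketch using Keras's `EarlyStopping` callback (the toy data and layer sizes are assumptions for illustration):

```python
import numpy as np
import tensorflow as tf

X = np.random.rand(1000, 10).astype("float32")
y = np.random.randint(0, 2, size=(1000,))

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Halt when validation loss has not improved for 2 consecutive epochs,
# and roll back to the weights from the best epoch.
stop = tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=2,
                                        restore_best_weights=True)
history = model.fit(X, y, epochs=50, validation_split=0.2,
                    callbacks=[stop], verbose=0)
```

Here the validation set is not just a scoreboard: it actively decides when training ends, which is one concrete way the split guards against overfitting.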