TensorFlow ~12 mins

Batch size and epochs in TensorFlow - Model Pipeline Trace


This pipeline shows how batch size and epochs affect the training of a simple neural network. Batch size controls how many samples the model processes before each weight update; epochs control how many full passes the model makes over the training dataset.
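The interaction between the two is simple arithmetic: each epoch contains one batch per `batch_size` chunk of the training set, and the total number of weight updates is batches per epoch times epochs. A minimal sketch (the function name is ours, not a TensorFlow API):

```python
import math

def count_weight_updates(n_samples, batch_size, epochs):
    """Total weight updates = batches per epoch x epochs."""
    batches_per_epoch = math.ceil(n_samples / batch_size)
    return batches_per_epoch * epochs

# With the values used in this pipeline: 800 training samples,
# batch size 100, 5 epochs -> 8 batches per epoch, 40 updates total.
print(count_weight_updates(800, 100, 5))  # -> 40
```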

Data Flow - 4 Stages
Stage 1: Data Loading
Load a dataset of 1000 samples with 10 features each.
Input: 1000 rows x 10 columns
Output: 1000 rows x 10 columns
Example: [[0.5, 1.2, ..., 0.3], [0.1, 0.4, ..., 0.9], ...]
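A stage like this can be sketched with NumPy; the random data and binary labels here are stand-ins for a real dataset, and the seed is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(seed=0)  # seed chosen for reproducibility
X = rng.random((1000, 10))           # 1000 samples x 10 features
y = rng.integers(0, 2, size=1000)    # hypothetical binary labels
print(X.shape)  # (1000, 10)
```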
Stage 2: Train/Test Split
Split the data into 800 training and 200 testing samples.
Input: 1000 rows x 10 columns
Output: 800 rows x 10 columns (train), 200 rows x 10 columns (test)
Example: Train sample: [0.5, 1.2, ..., 0.3], Test sample: [0.2, 0.7, ..., 0.1]
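An 80/20 split can be done by slicing; in practice you would usually shuffle first (or use a helper such as scikit-learn's `train_test_split`). A minimal sketch on stand-in data:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
X = rng.random((1000, 10))           # stand-in dataset
y = rng.integers(0, 2, size=1000)

# 80/20 split by slicing the first 800 rows for training
X_train, X_test = X[:800], X[800:]
y_train, y_test = y[:800], y[800:]
print(X_train.shape, X_test.shape)   # (800, 10) (200, 10)
```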
Stage 3: Batching
Divide the training data into batches of 100 samples.
Input: 800 rows x 10 columns
Output: 8 batches x 100 rows x 10 columns
Example: Batch 1: 100 samples, Batch 2: 100 samples, ...
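Since 800 divides evenly by 100, the batches can be produced with `np.split`; a sketch on placeholder data (a `tf.data.Dataset` with `.batch(100)` would play this role in a real TensorFlow pipeline):

```python
import numpy as np

X_train = np.zeros((800, 10))        # stand-in for the 800 training samples
batch_size = 100
batches = np.split(X_train, len(X_train) // batch_size)
print(len(batches), batches[0].shape)  # 8 (100, 10)
```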
Stage 4: Model Training
Train the model on each batch in turn, repeating for 5 epochs.
Input: batches of 100 rows x 10 columns
Output: model weights updated after each batch
Example: Batch 1 training updates weights, then Batch 2, ...; repeat for 5 epochs
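In Keras this whole loop is a single call, `model.fit(X_train, y_train, batch_size=100, epochs=5)`. To make the mechanics visible, here is a hand-rolled sketch of the same schedule using logistic regression in NumPy; the learning rate and synthetic labels are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
X = rng.random((800, 10))
y = (X.sum(axis=1) > 5).astype(float)      # hypothetical labels

w = np.zeros(10)                           # model weights
b = 0.0
lr, batch_size, epochs = 0.1, 100, 5       # illustrative hyperparameters
updates = 0

for epoch in range(epochs):                # one epoch = one full pass
    for start in range(0, len(X), batch_size):
        xb = X[start:start + batch_size]   # one batch of 100 samples
        yb = y[start:start + batch_size]
        p = 1.0 / (1.0 + np.exp(-(xb @ w + b)))  # sigmoid predictions
        grad_w = xb.T @ (p - yb) / len(xb)       # logistic-loss gradient
        grad_b = (p - yb).mean()
        w -= lr * grad_w                   # weights updated after each batch
        b -= lr * grad_b
        updates += 1

print(updates)  # 8 batches x 5 epochs = 40 updates
```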
Training Trace - Epoch by Epoch
Loss
0.7 |****
0.6 |*** 
0.5 |**  
0.4 |**  
0.3 |*   
0.2 |    
     1 2 3 4 5 Epochs
Epoch | Loss ↓ | Accuracy ↑ | Observation
  1   |  0.65  |    0.60    | Model starts learning with moderate loss and accuracy
  2   |  0.45  |    0.75    | Loss decreases and accuracy improves as the model learns
  3   |  0.35  |    0.82    | Model continues to improve with more epochs
  4   |  0.30  |    0.86    | Loss decreases steadily, accuracy rises
  5   |  0.28  |    0.88    | Training converges with low loss and high accuracy
Prediction Trace - 3 Layers
Layer 1: Input Layer
Layer 2: Hidden Layer (ReLU)
Layer 3: Output Layer (Sigmoid)
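The three layers above can be sketched as a single forward pass in NumPy; the hidden width of 16 and the random weights are arbitrary choices for illustration:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(seed=0)
x = rng.random((1, 10))          # Layer 1: input, 10 features
W1 = rng.normal(size=(10, 16))   # hidden width 16 is an arbitrary choice
b1 = np.zeros(16)
W2 = rng.normal(size=(16, 1))
b2 = np.zeros(1)

h = relu(x @ W1 + b1)            # Layer 2: hidden layer with ReLU
out = sigmoid(h @ W2 + b2)       # Layer 3: output layer with sigmoid
print(out.shape)                 # (1, 1), a probability in (0, 1)
```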
Model Quiz - 3 Questions
Test your understanding
What does increasing the batch size do during training?
A. Increases the number of epochs
B. Reduces the number of features
C. Processes more samples before updating model weights
D. Changes the model architecture
Answer: C
Key Insight
Batch size controls how many samples the model learns from before updating weights, affecting training speed and stability. Epochs control how many times the model sees the entire dataset, allowing it to improve gradually. Together, they balance learning quality and training time.