TensorFlow · ML · ~12 min

First neural network in TensorFlow - Model Pipeline Trace

Model Pipeline - First neural network

This pipeline shows how a simple neural network learns to classify handwritten digits from images. It starts with raw image data, processes it, trains a small neural network, and then makes predictions.

Data Flow - 5 Stages
1. Load Data
   Load the MNIST handwritten digits dataset.
   Input: 70000 rows x 28 x 28 pixels → Output: 70000 rows x 28 x 28 pixels
   Example: the digit '5' as a 28x28 grid of grayscale pixels
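Stage 1 can be sketched with Keras's built-in MNIST loader (downloads ~11 MB on first use). Keras returns the data already split into train/test partitions; the trace treats it as one 70000-image pool until stage 4, so this sketch pools the two partitions back together:

```python
import numpy as np
import tensorflow as tf

# Load MNIST; Keras returns it pre-split into train and test partitions.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()

# Pool both partitions to match the trace's single 70000-image dataset.
images = np.concatenate([x_train, x_test])
labels = np.concatenate([y_train, y_test])
print(images.shape)  # (70000, 28, 28)
```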
2. Preprocessing
   Normalize pixel values from 0-255 to 0-1.
   Input: 70000 rows x 28 x 28 pixels → Output: 70000 rows x 28 x 28 pixels
   Example: pixel value 0 becomes 0.0, 255 becomes 1.0
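The normalization in stage 2 is a single division. A minimal NumPy sketch, using a tiny stand-in array instead of the real dataset:

```python
import numpy as np

# Stand-in for a batch of grayscale images with values in 0..255.
images = np.array([[[0, 128], [255, 64]]], dtype=np.uint8)

# Scale to the 0.0-1.0 range; float32 is what the model trains on.
normalized = images.astype(np.float32) / 255.0

print(normalized.min(), normalized.max())  # 0.0 1.0
```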
3. Flatten Images
   Convert 2D images to 1D vectors.
   Input: 70000 rows x 28 x 28 pixels → Output: 70000 rows x 784 columns
   Example: a 28x28 image becomes a list of 784 pixel values
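Flattening is a reshape that keeps the batch dimension and unrolls each 28x28 grid row by row into a 784-element vector. A sketch with two fake images standing in for the full dataset:

```python
import numpy as np

# Two fake 28x28 "images" stand in for the 70000-image dataset.
images = np.arange(2 * 28 * 28, dtype=np.float32).reshape(2, 28, 28)

# -1 lets NumPy infer 784 (= 28 * 28) from the remaining dimensions.
flat = images.reshape(len(images), -1)

print(flat.shape)  # (2, 784)
```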
4. Train/Test Split
   Split the data into training (60000) and test (10000) sets.
   Input: 70000 rows x 784 columns → Output: training 60000 rows x 784 columns, test 10000 rows x 784 columns
   Example: a training-set image vector paired with its label, e.g. the digit '3'
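Starting from one pooled array, stage 4 is a shuffle followed by slicing. A NumPy sketch at reduced scale (100 rows instead of 70000; the 6:1 ratio is what matters):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the pooled, flattened dataset: 100 rows of 784 features.
data = rng.random((100, 784)).astype(np.float32)
labels = rng.integers(0, 10, size=100)

# Shuffle once, then slice: first 6/7 for training, the rest for test
# (matching MNIST's 60000/10000 ratio).
order = rng.permutation(len(data))
split = int(len(data) * 6 / 7)
train_idx, test_idx = order[:split], order[split:]

x_train, y_train = data[train_idx], labels[train_idx]
x_test, y_test = data[test_idx], labels[test_idx]
print(x_train.shape, x_test.shape)  # (85, 784) (15, 784)
```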
5. Model Training
   Train a neural network with one hidden layer.
   Input: 60000 rows x 784 columns → Output: trained model with weights and biases
   The model learns to map input vectors to digit labels.
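Stage 5, sketched as a Keras model with a single hidden layer. The width of 128 units is an assumption (the trace does not state it), and the model is fitted here on a small random batch just to show the API:

```python
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),                    # flattened image
    tf.keras.layers.Dense(128, activation="relu"),   # hidden layer
    tf.keras.layers.Dense(10, activation="softmax"), # one unit per digit
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Tiny random stand-in for the 60000 x 784 training set.
rng = np.random.default_rng(0)
x = rng.random((256, 784)).astype("float32")
y = rng.integers(0, 10, size=256)
history = model.fit(x, y, epochs=2, verbose=0)
print(model.count_params())  # 101770 weights and biases
```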
Training Trace - Epoch by Epoch
Loss
0.5 |****
0.4 |***
0.3 |**
0.2 |*
0.1 |
    +---------
     1 2 3 4 5 Epochs
Epoch | Loss ↓ | Accuracy ↑ | Observation
1     | 0.45   | 0.87       | Model starts learning; accuracy improves quickly
2     | 0.30   | 0.92       | Loss decreases, accuracy increases as the model fits the data
3     | 0.25   | 0.94       | Model continues to improve with more training
4     | 0.22   | 0.95       | Loss decreases steadily; accuracy nears 95%
5     | 0.20   | 0.96       | Training converges with high accuracy
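Epoch-by-epoch numbers like these come from the History object that Keras returns from fit. A sketch with synthetic data (the printed losses will not match the table above, which came from real MNIST training):

```python
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

rng = np.random.default_rng(0)
x = rng.random((128, 784)).astype("float32")
y = rng.integers(0, 10, size=128)
history = model.fit(x, y, epochs=5, verbose=0)

# One row per epoch, like the table above.
for epoch, (loss, acc) in enumerate(
        zip(history.history["loss"], history.history["accuracy"]), start=1):
    print(f"{epoch}  {loss:.2f}  {acc:.2f}")
```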
Prediction Trace - 4 Layers
Layer 1: Input Layer
Layer 2: Hidden Layer (Dense + ReLU)
Layer 3: Output Layer (Dense + Softmax)
Layer 4: Prediction
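The four layers of the prediction trace can be followed by hand in NumPy, with random weights standing in for the trained ones (the predicted digit is therefore meaningless; the point is the shape of each step):

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer 1: input - one flattened 784-pixel image.
x = rng.random(784).astype(np.float32)

# Random stand-ins for the trained weights and biases.
w1, b1 = rng.standard_normal((784, 128)) * 0.05, np.zeros(128)
w2, b2 = rng.standard_normal((128, 10)) * 0.05, np.zeros(10)

# Layer 2: hidden layer - dense (matrix multiply) + ReLU.
hidden = np.maximum(x @ w1 + b1, 0.0)

# Layer 3: output layer - dense + softmax, yielding 10 probabilities.
logits = hidden @ w2 + b2
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# Layer 4: prediction - the digit with the highest probability.
print(probs.shape, int(np.argmax(probs)))
```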
Model Quiz - 3 Questions
Test your understanding
What does the 'Flatten Images' step do to the data?
A. Converts 2D images into 1D vectors
B. Splits data into training and test sets
C. Normalizes pixel values between 0 and 1
D. Applies activation functions to neurons
Key Insight
A simple neural network learns by adjusting weights to reduce error (loss) and improve accuracy. Flattening images and normalizing pixels prepare data for the model. Activation functions like ReLU and softmax help the network learn complex patterns and output probabilities.