
Forward propagation in ML Python

Introduction
Forward propagation helps a model make predictions by passing input data through its layers step by step. It is useful in situations such as:
- Predicting the output for new data with a trained model.
- Understanding how input features affect the model's output.
- Calculating output values during training, before the weights are adjusted.
- Debugging or visualizing how data moves through a neural network.
- Implementing a simple neural network from scratch to learn how it works.
Syntax
ML Python
output = activation_function(weighted_sum(inputs, weights) + bias)
Each layer computes a weighted sum of inputs plus a bias, then applies an activation function.
Forward propagation moves from input layer through hidden layers to output layer.
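This layer-by-layer pattern can be sketched as a small helper function (`dense_forward` is a hypothetical name used here for illustration; the random weights stand in for a trained layer):

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def dense_forward(x, W, b, activation):
    # Weighted sum of inputs plus bias, then the activation function
    return activation(x @ W + b)

x = np.array([0.5, 0.3])                # 2 input features
W = np.random.randn(2, 4) * 0.1         # layer with 4 neurons
b = np.zeros(4)

h = dense_forward(x, W, b, sigmoid)     # one layer's output
print(h.shape)  # (4,)
```

Chaining calls to such a helper, one per layer, is exactly the movement from input layer through hidden layers to output layer described above.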
Examples
Calculate the weighted sum z and apply the sigmoid activation to get the output.
ML Python
z = w1 * x1 + w2 * x2 + b
output = sigmoid(z)
Forward pass through two layers: the first with ReLU, the second with softmax activation.
ML Python
layer1_output = relu(np.dot(inputs, weights1) + bias1)
final_output = softmax(np.dot(layer1_output, weights2) + bias2)
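The two-layer snippet above assumes `relu` and `softmax` are already defined; a self-contained sketch with illustrative weight values might look like this:

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

def softmax(x):
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()

inputs = np.array([0.5, 0.3])
weights1 = np.array([[0.4, -0.2, 0.1],
                     [0.7, 0.5, -0.3]])   # shape (2, 3): 2 inputs -> 3 hidden units
bias1 = np.array([0.1, 0.0, -0.1])
weights2 = np.array([[0.2, -0.4],
                     [0.6, 0.1],
                     [-0.5, 0.3]])        # shape (3, 2): 3 hidden units -> 2 outputs
bias2 = np.array([0.05, -0.05])

layer1_output = relu(np.dot(inputs, weights1) + bias1)
final_output = softmax(np.dot(layer1_output, weights2) + bias2)
print(final_output)  # two class probabilities summing to 1
```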
Sample Model
This code shows forward propagation for a single neuron with two inputs. It calculates the weighted sum and applies the sigmoid activation to produce the output.
ML Python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Input features
inputs = np.array([0.5, 0.3])

# Weights and bias for one neuron
weights = np.array([0.4, 0.7])
bias = 0.1

# Forward propagation step
z = np.dot(inputs, weights) + bias
output = sigmoid(z)

print(f"Weighted sum (z): {z:.4f}")
print(f"Output after sigmoid: {output:.4f}")
Output
Weighted sum (z): 0.5100
Output after sigmoid: 0.6248
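The same single-neuron code extends naturally to a batch of inputs: stacking samples as rows lets one `np.dot` process all of them at once (an illustrative extension of the sample above, with two extra made-up samples):

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

weights = np.array([0.4, 0.7])
bias = 0.1

# Each row is one sample with two features; the first row
# matches the single-input example above.
batch = np.array([[0.5, 0.3],
                  [0.9, 0.1],
                  [0.2, 0.8]])

z = np.dot(batch, weights) + bias   # shape (3,): one weighted sum per sample
outputs = sigmoid(z)
print(outputs)
```

The first entry of `outputs` reproduces the 0.6248 from the single-sample run.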
Important Notes
Activation functions like sigmoid, ReLU, or softmax add non-linearity so the model can learn complex patterns.
Forward propagation is just the first step; training adjusts weights based on errors found after this step.
Keep track of shapes of inputs and weights to avoid errors during matrix multiplication.
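One way to keep track of shapes in practice is to assert them before multiplying; `checked_forward` below is a hypothetical debugging helper, not part of NumPy:

```python
import numpy as np

def checked_forward(x, W, b):
    # Inner dimensions must match: (n_features,) @ (n_features, n_neurons)
    assert x.shape[0] == W.shape[0], f"shape mismatch: {x.shape} vs {W.shape}"
    assert b.shape[0] == W.shape[1], f"bias shape {b.shape} != {W.shape[1]} neurons"
    return np.dot(x, W) + b

x = np.array([0.5, 0.3])
W = np.zeros((2, 4))   # 2 features -> 4 neurons
b = np.zeros(4)

h = checked_forward(x, W, b)
print(h.shape)  # (4,)
```

A failed assertion points directly at the mismatched layer, which is usually faster to diagnose than NumPy's generic dimension-mismatch error.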
Summary
Forward propagation passes input data through the model to get predictions.
It involves weighted sums and activation functions at each layer.
Understanding forward propagation helps you see how models make decisions.