PyTorch · ~5 mins

Forward pass, loss, backward, step in PyTorch

Introduction

These steps teach a model to learn from data: it makes a prediction, measures its mistake, and adjusts its weights to do better next time.

When training a model to recognize images like cats and dogs.
When teaching a robot to understand spoken commands.
When building a system to predict house prices from features.
When improving a chatbot to give better answers.
When adjusting a model to classify emails as spam or not.
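As a concrete sketch, here is how the training steps might look for the house-price scenario above. The feature values, prices, and learning rate are made up for illustration:

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Made-up data: [square meters, bedrooms] -> price (in $1000s)
features = torch.tensor([[50.0, 1.0], [80.0, 2.0], [120.0, 3.0]])
prices = torch.tensor([[150.0], [240.0], [360.0]])

model = nn.Linear(2, 1)
criterion = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=1e-5)

optimizer.zero_grad()                   # clear any old gradients
predictions = model(features)           # forward pass: guess prices
loss = criterion(predictions, prices)   # measure the error
loss.backward()                         # compute gradients
optimizer.step()                        # nudge the weights
print(f"Loss: {loss.item():.2f}")
```

The same five lines appear in every scenario on the list; only the model, the data, and the loss function change.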
Syntax
PyTorch
optimizer.zero_grad()
output = model(input)
loss = loss_function(output, target)
loss.backward()
optimizer.step()

optimizer.zero_grad() clears gradients left over from the previous step.

output = model(input) runs the data through the model to get predictions.

loss = loss_function(output, target) measures how far the predictions are from the targets.

loss.backward() calculates how each parameter should change to reduce mistakes.

optimizer.step() applies those changes to the model's weights.
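To see what loss.backward() actually produces, this minimal sketch (a single linear layer with random data, chosen only for illustration) inspects a parameter's .grad attribute before and after the call:

```python
import torch
import torch.nn as nn

# A tiny model so we can inspect its gradients directly
model = nn.Linear(3, 1)
loss_fn = nn.MSELoss()

x = torch.randn(4, 3)   # 4 samples, 3 features
y = torch.randn(4, 1)   # 4 targets

loss = loss_fn(model(x), y)
print(model.weight.grad)        # None: no gradients computed yet

loss.backward()
print(model.weight.grad.shape)  # torch.Size([1, 3]): one gradient per weight
```

Before backward() the gradients don't exist; afterwards every parameter that contributed to the loss has one, and optimizer.step() reads exactly these values.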

Examples
Basic training step with input x and target y.
PyTorch
optimizer.zero_grad()
output = model(x)
loss = criterion(output, y)
loss.backward()
optimizer.step()
Training step for image data with predictions and labels.
PyTorch
optimizer.zero_grad()
predictions = model(images)
loss = loss_fn(predictions, labels)
loss.backward()
optimizer.step()
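In practice, a single training step like the ones above is wrapped in a loop that repeats it for many epochs. A minimal self-contained sketch (the model, data, and epoch count are placeholders for illustration):

```python
import torch
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(2, 1)
loss_fn = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.01)

inputs = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
targets = torch.tensor([[3.0], [7.0]])

initial = loss_fn(model(inputs), targets).item()

# Repeat the same steps for many epochs
for epoch in range(100):
    optimizer.zero_grad()                 # clear old gradients
    predictions = model(inputs)           # forward pass
    loss = loss_fn(predictions, targets)  # measure error
    loss.backward()                       # compute gradients
    optimizer.step()                      # update weights

print(f"Loss: {initial:.4f} -> {loss.item():.4f}")
```

Each pass through the loop nudges the weights a little; over many epochs the loss shrinks toward zero.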
Sample Model

This code creates a simple model that learns to predict targets from inputs. It shows the loss before and after one training step.

PyTorch
import torch
import torch.nn as nn
import torch.optim as optim

# Simple model: one linear layer
class SimpleModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(2, 1)
    def forward(self, x):
        return self.linear(x)

# Data: inputs and targets
inputs = torch.tensor([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
targets = torch.tensor([[3.0], [7.0], [11.0]])

# Model, loss, optimizer
model = SimpleModel()
criterion = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.01)

# Training step
optimizer.zero_grad()
output = model(inputs)
loss = criterion(output, targets)
print(f"Loss before backward: {loss.item():.4f}")

loss.backward()
optimizer.step()

# Check loss after one step
output2 = model(inputs)
loss2 = criterion(output2, targets)
print(f"Loss after one step: {loss2.item():.4f}")
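Once training is done, predictions are usually made inside torch.no_grad() so PyTorch skips the gradient bookkeeping. A minimal sketch with a plain linear layer (the input values are arbitrary):

```python
import torch
import torch.nn as nn

model = nn.Linear(2, 1)
model.eval()                     # switch to evaluation mode

# No gradients are needed for inference, so skip the bookkeeping
with torch.no_grad():
    prediction = model(torch.tensor([[2.0, 3.0]]))

print(prediction.requires_grad)  # False: no computation graph was built
```

Skipping the graph makes inference faster and uses less memory, which matters once models get large.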
Important Notes

Always call optimizer.zero_grad() before loss.backward() so gradients from the previous step don't accumulate into the new ones.

The loss.backward() step calculates gradients automatically for all model parameters.

The optimizer.step() updates the model weights using the gradients.
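To see why clearing gradients matters, this sketch calls backward() twice without zero_grad() in between; PyTorch adds the second set of gradients onto the first instead of replacing it:

```python
import torch
import torch.nn as nn

model = nn.Linear(2, 1)
loss_fn = nn.MSELoss()
x = torch.ones(1, 2)
y = torch.zeros(1, 1)

loss_fn(model(x), y).backward()
first = model.weight.grad.clone()

# Without zero_grad(), the next backward() adds to the old gradients
loss_fn(model(x), y).backward()
print(torch.allclose(model.weight.grad, 2 * first))  # True: gradients doubled

model.zero_grad()  # reset before the next training step
```

Forgetting the reset means every step uses stale gradients mixed with fresh ones, which silently corrupts training.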

Summary

The forward pass runs data through the model to get predictions.

The loss measures how far predictions are from targets.

The backward pass computes gradients that tell each parameter how to change.

The optimizer step updates the model weights to reduce future mistakes.