
requires_grad flag in PyTorch

Introduction

The requires_grad flag tells PyTorch whether to record the operations performed on a tensor so that gradients can be computed for it. During training, these gradients are used to adjust the model's parameters.

When to Use

When you want to train a model and update its weights automatically.
When you want to freeze some parts of a model so they don't change during training.
When you want to calculate gradients for optimization in neural networks.
When you want to save memory by not tracking gradients for certain tensors.
When you want to do inference only and don't need to update model parameters.
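For instance, freezing part of a model means setting requires_grad to False on the parameters you want to keep fixed. A minimal sketch, using a small nn.Sequential stand-in for a real pretrained network (the model itself is hypothetical):

```python
import torch
import torch.nn as nn

# A small model standing in for a pretrained network (hypothetical example)
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# Freeze the first layer so its weights stay fixed during training
for param in model[0].parameters():
    param.requires_grad = False

# Only parameters that still require gradients should go to the optimizer
trainable = [p for p in model.parameters() if p.requires_grad]
print(len(trainable))  # 2: the weight and bias of the second Linear layer
```

Passing only the still-trainable parameters to the optimizer is the usual companion step when freezing layers this way.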
Syntax
PyTorch
tensor = torch.tensor(data, requires_grad=True)  # or requires_grad=False

Set requires_grad=True if you want PyTorch to track operations on this tensor.

Set requires_grad=False (the default for tensors created directly) to skip tracking and save memory.

Examples
This creates a tensor x that PyTorch will track for gradients.
PyTorch
import torch
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
This creates a tensor y that will not track gradients.
PyTorch
y = torch.tensor([4.0, 5.0, 6.0], requires_grad=False)
This changes the existing tensor x to stop tracking gradients.
PyTorch
x.requires_grad_(False)
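You can confirm whether tracking is active by inspecting a tensor's requires_grad attribute, and check grad_fn to see whether a tensor was produced by a tracked operation. A short sketch:

```python
import torch

a = torch.tensor([1.0], requires_grad=True)
b = a * 2  # b results from a tracked operation on a

print(a.requires_grad)  # True: a is a tracked leaf tensor
print(b.grad_fn)        # a MulBackward0 object recording how b was made
```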
Sample Model

This program creates a tensor x that tracks gradients. It computes a simple function and then calculates gradients of z with respect to x. The gradients show how much z changes if x changes.

PyTorch
import torch

# Create a tensor with requires_grad=True
x = torch.tensor([2.0, 3.0], requires_grad=True)

# Define a simple function y = x^2 + 3x
y = x**2 + 3*x

# Compute the sum to get a scalar output
z = y.sum()

# Compute gradients
z.backward()

# Print gradients of x
print(x.grad)
Output
tensor([7., 9.])
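The printed gradients can be checked by hand: for z = sum(x^2 + 3x), the gradient with respect to each element is dz/dx = 2x + 3, which gives 7 and 9 at x = [2, 3]. A quick verification sketch:

```python
import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)
z = (x**2 + 3*x).sum()
z.backward()

# Analytically, dz/dx = 2x + 3
expected = 2 * x.detach() + 3
print(torch.equal(x.grad, expected))  # True
```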
Important Notes

Gradients are only computed for tensors with requires_grad=True.

You can temporarily turn off gradient tracking with a with torch.no_grad(): block, which is useful for inference.
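A minimal sketch of this: inside the no_grad block, results are not connected to the autograd graph even when the inputs require gradients.

```python
import torch

x = torch.tensor([1.0, 2.0], requires_grad=True)

# Operations inside no_grad are not recorded, saving memory at inference time
with torch.no_grad():
    y = x * 2

print(y.requires_grad)  # False: y carries no gradient history
```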

You can change requires_grad after a tensor is created with the in-place method tensor.requires_grad_(True) or tensor.requires_grad_(False).

Summary

requires_grad tells PyTorch to track operations on a tensor so gradients can be computed.

Set it to True for tensors you want to update during training.

Set it to False to save memory or freeze parts of a model.