PyTorch · How-To · Beginner · 3 min read

How to Use nn.ReLU in PyTorch: Simple Guide with Examples

In PyTorch, nn.ReLU is an activation module that replaces negative values with zero and leaves positive values unchanged. Create an instance with nn.ReLU() and apply it to tensors directly or use it as a layer inside a neural network.
📐

Syntax

The nn.ReLU class creates a ReLU activation function object. You can use it in two ways: as a module in a neural network or as a function applied to tensors.

  • Creating ReLU: relu = nn.ReLU()
  • Applying ReLU: output = relu(input_tensor)

This replaces all negative values in input_tensor with zero and keeps positive values unchanged.

python
import torch
import torch.nn as nn

relu = nn.ReLU()
input_tensor = torch.tensor([-1.0, 0.0, 1.0, 2.0])
output = relu(input_tensor)
print(output)
Output
tensor([0., 0., 1., 2.])
💻

Example

This example shows how to use nn.ReLU inside a simple neural network layer and apply it to input data.

python
import torch
import torch.nn as nn

class SimpleNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(3, 3)
        self.relu = nn.ReLU()

    def forward(self, x):
        x = self.linear(x)
        x = self.relu(x)
        return x

net = SimpleNet()
input_data = torch.tensor([[1.0, -1.0, 0.5]])
output = net(input_data)
print(output)
Output (exact values depend on the random weight initialization, so they will differ between runs)
tensor([[0.0000, 0.0000, 0.0000]], grad_fn=<ReluBackward0>)
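The same pattern also works with nn.Sequential, where nn.ReLU() simply slots in as a layer. A minimal sketch (the layer sizes here are arbitrary):

```python
import torch
import torch.nn as nn

# nn.ReLU() used as a layer inside nn.Sequential
net = nn.Sequential(
    nn.Linear(3, 3),
    nn.ReLU(),
)

output = net(torch.tensor([[1.0, -1.0, 0.5]]))
print(output.shape)         # torch.Size([1, 3])
print((output >= 0).all())  # ReLU guarantees non-negative outputs
```

Because ReLU has no learnable parameters, a single nn.ReLU() instance can be reused anywhere in the model without affecting training.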
⚠️

Common Pitfalls

1. Forgetting to create an instance: You must create relu = nn.ReLU() before using it. Calling nn.ReLU(input) does not apply ReLU; the constructor silently interprets the tensor as its inplace argument and returns a new module instead of an activated tensor.

2. Using inplace=True without care: nn.ReLU(inplace=True) modifies data in place, which can cause issues if you need the original data later.

3. Confusing nn.ReLU with functional API: torch.nn.functional.relu is a function, while nn.ReLU is a module. Use them accordingly.

python
import torch
import torch.nn as nn

# Wrong: passing a tensor to the constructor does not apply ReLU;
# the tensor is silently taken as the `inplace` flag, and a module
# is returned instead of an activated tensor
result = nn.ReLU(torch.tensor([-1.0, 2.0]))
print(type(result))

# Right: create an instance, then call it on the tensor
relu = nn.ReLU()
output = relu(torch.tensor([-1.0, 2.0]))
print(output)
Output
<class 'torch.nn.modules.activation.ReLU'>
tensor([0., 2.])
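Pitfall 2 can be sketched as well: with inplace=True the input tensor itself is overwritten, so the original negative values are lost.

```python
import torch
import torch.nn as nn

x = torch.tensor([-1.0, 2.0])
_ = nn.ReLU()(x)              # default: x is left untouched
print(x)                      # x still contains -1.0

y = torch.tensor([-1.0, 2.0])
_ = nn.ReLU(inplace=True)(y)  # inplace: y itself is overwritten
print(y)                      # y's -1.0 has become 0.0
```

inplace=True saves a small amount of memory, but avoid it when the input tensor is needed later (for example, by a skip connection).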
📊

Quick Reference

Usage                              Description
nn.ReLU()                          Creates a ReLU activation module
relu(input_tensor)                 Applies ReLU to a tensor, zeroing negatives
nn.ReLU(inplace=True)              Applies ReLU, modifying the input tensor in place
torch.nn.functional.relu(input)    Functional API alternative to the nn.ReLU module
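The last two table rows can be compared directly; the module and functional forms compute the same result, shown here on a simple 1-D tensor:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.tensor([-1.0, 0.0, 2.0])

module_out = nn.ReLU()(x)  # module form: instantiate, then call
func_out = F.relu(x)       # functional form: call directly

print(torch.equal(module_out, func_out))  # True
```

The module form is convenient when building models with nn.Sequential; the functional form avoids storing an extra attribute when writing forward() by hand.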

Key Takeaways

  • Create an instance of nn.ReLU() before applying it to tensors.
  • nn.ReLU replaces negative values with zero and keeps positives unchanged.
  • Use inplace=True carefully to avoid unwanted data modification.
  • nn.ReLU is a module; torch.nn.functional.relu is a functional alternative.
  • Apply nn.ReLU inside neural network layers for non-linear activation.