How to Use nn.ReLU in PyTorch: Simple Guide with Examples
nn.ReLU is an activation function that replaces negative values with zero and leaves positive values unchanged. You can use it by creating an instance with nn.ReLU() and applying it to tensors or inside neural network layers.

Syntax
The nn.ReLU class creates a ReLU activation function object. You can use it in two ways: as a module in a neural network or as a function applied to tensors.
- Creating ReLU: relu = nn.ReLU()
- Applying ReLU: output = relu(input_tensor)
This replaces all negative values in input_tensor with zero and keeps positive values unchanged.
```python
import torch
import torch.nn as nn

relu = nn.ReLU()
input_tensor = torch.tensor([-1.0, 0.0, 1.0, 2.0])
output = relu(input_tensor)
print(output)  # tensor([0., 0., 1., 2.])
```
Example
This example shows how to use nn.ReLU inside a simple neural network layer and apply it to input data.
```python
import torch
import torch.nn as nn

class SimpleNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(3, 3)
        self.relu = nn.ReLU()

    def forward(self, x):
        x = self.linear(x)
        x = self.relu(x)
        return x

net = SimpleNet()
input_data = torch.tensor([[1.0, -1.0, 0.5]])
output = net(input_data)
print(output)
```
Common Pitfalls
1. Forgetting to create an instance: Create relu = nn.ReLU() first, then call relu(input). Calling nn.ReLU(input) does not apply the activation; the tensor is silently interpreted as the constructor's inplace argument, so you get back a module instead of a result tensor.
2. Using inplace=True without care: nn.ReLU(inplace=True) overwrites the input tensor's memory. This saves an allocation, but it can corrupt data you need later and can break autograd when the original values are required for the backward pass.
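A quick sketch of the in-place behavior, showing that the original tensor is overwritten and that the output is the very same tensor object:

```python
import torch
import torch.nn as nn

x = torch.tensor([-1.0, 2.0])
relu_inplace = nn.ReLU(inplace=True)
out = relu_inplace(x)

# x itself has been modified: the -1.0 is now 0.0
print(x)         # tensor([0., 2.])
print(out is x)  # True: no new tensor was allocated
```

If you need the original values afterward, either use the default nn.ReLU() or clone the tensor before applying the in-place version.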
3. Confusing nn.ReLU with functional API: torch.nn.functional.relu is a function, while nn.ReLU is a module. Use them accordingly.
```python
import torch
import torch.nn as nn

# Wrong: the tensor is taken as the constructor's inplace argument,
# so this builds a ReLU module instead of applying the activation
result = nn.ReLU(torch.tensor([-1.0, 2.0]))
print(type(result))  # <class 'torch.nn.modules.activation.ReLU'>

# Right: create an instance, then call it on the tensor
relu = nn.ReLU()
output = relu(torch.tensor([-1.0, 2.0]))
print(output)  # tensor([0., 2.])
```
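To illustrate pitfall 3, the functional form produces the same values without constructing a module; a minimal sketch comparing the two:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.tensor([-1.0, 0.0, 2.0])

# Module form: construct once, call many times; appears in model summaries
module_out = nn.ReLU()(x)

# Functional form: a plain function call, handy inside forward()
functional_out = F.relu(x)

print(torch.equal(module_out, functional_out))  # True
```

The module form is preferred when you want the activation listed as a layer of your model; the functional form avoids storing a stateless object as an attribute.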
Quick Reference
| Usage | Description |
|---|---|
| nn.ReLU() | Creates a ReLU activation module |
| relu(input_tensor) | Applies ReLU to input tensor, zeroing negatives |
| nn.ReLU(inplace=True) | Applies ReLU modifying input tensor in place |
| torch.nn.functional.relu(input) | Functional API alternative to nn.ReLU module |
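As a usage note, nn.ReLU being a module is what lets it slot directly into containers such as nn.Sequential; a minimal sketch (the layer sizes here are arbitrary):

```python
import torch
import torch.nn as nn

# A two-layer net with ReLU between the linear layers
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 2),
)

x = torch.randn(1, 4)
output = model(x)
print(output.shape)  # torch.Size([1, 2])
```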