
nn.Module vs nn.functional in PyTorch: Key Differences and Usage

In PyTorch, nn.Module is a class used to create layers or models with built-in parameter management, while nn.functional provides stateless functions for operations like activations and convolutions. Use nn.Module when you want reusable layers with parameters, and nn.functional for direct function calls without parameter storage.
⚖️

Quick Comparison

This table summarizes the main differences between nn.Module and nn.functional in PyTorch.

| Aspect | nn.Module | nn.functional |
| --- | --- | --- |
| Type | Class with parameters and state | Stateless functions |
| Parameter Management | Automatically stores weights and biases | No parameter storage; requires manual handling |
| Usage | Define layers or models | Apply operations like activations or convolutions directly |
| Reusability | Reusable as objects with saved parameters | Used inline without saving state |
| Example | nn.Linear, nn.Conv2d | F.relu, F.conv2d |
| Flexibility | Less flexible but easier for standard layers | More flexible for custom operations |
⚖️

Key Differences

nn.Module is a base class for all neural network layers and models in PyTorch. It manages parameters like weights and biases automatically, making it easy to build reusable layers. When you create a layer like nn.Linear, it inherits from nn.Module and stores its parameters internally.
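To see the automatic parameter management in action: parameters registered by an nn.Module are visible through named_parameters(), which is what optimizers consume. A minimal sketch:

```python
import torch.nn as nn

# nn.Linear creates and registers its weight and bias automatically
layer = nn.Linear(3, 2)
for name, p in layer.named_parameters():
    print(name, tuple(p.shape))
# weight (2, 3)
# bias (2,)
```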

On the other hand, nn.functional is a module that contains functions for operations such as activation functions, convolutions, and pooling. These functions do not store any parameters themselves; you must provide all inputs explicitly. This makes nn.functional more flexible for custom operations but requires manual parameter management.

In practice, nn.Module is used to define layers and models that hold parameters and can be reused easily, while nn.functional is often used inside nn.Module implementations to perform the actual computations.

⚖️

Code Comparison

Here is an example of using nn.Module to create a simple linear layer with ReLU activation.

```python
import torch
import torch.nn as nn

class SimpleModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(3, 2)  # Layer with parameters
        self.relu = nn.ReLU()          # Activation as a module

    def forward(self, x):
        x = self.linear(x)
        x = self.relu(x)
        return x

model = SimpleModel()
input_tensor = torch.tensor([[1.0, 2.0, 3.0]])
output = model(input_tensor)
print(output)
```

Output (exact values depend on the random weight initialization):

```
tensor([[0.0000, 0.0000]], grad_fn=<ReluBackward0>)
```
↔️

nn.functional Equivalent

The same model can be implemented using nn.functional for activation, while still using nn.Module for the linear layer to hold parameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleModelFunctional(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(3, 2)  # Layer with parameters

    def forward(self, x):
        x = self.linear(x)
        x = F.relu(x)  # Functional activation, no module object needed
        return x

model = SimpleModelFunctional()
input_tensor = torch.tensor([[1.0, 2.0, 3.0]])
output = model(input_tensor)
print(output)
```

Output (exact values depend on the random weight initialization):

```
tensor([[0.0000, 0.0000]], grad_fn=<ReluBackward0>)
```
🎯

When to Use Which

Choose nn.Module when you want to create layers or models that manage their own parameters and can be reused easily. It simplifies building complex models by encapsulating parameters and operations.

Choose nn.functional when you need more control over operations or want to apply functions without creating new layer objects. It is useful for custom computations inside nn.Module or when you want to avoid extra objects.
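For instance, parameter-free operations such as pooling can be applied inline as functions, with no layer object created at all. A minimal sketch:

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 1, 4, 4)  # batch, channels, height, width
# No nn.MaxPool2d object is constructed; the function is called directly
pooled = F.max_pool2d(x, kernel_size=2)
print(pooled.shape)  # torch.Size([1, 1, 2, 2])
```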

In most cases, combine both: use nn.Module for layers and nn.functional for operations like activations inside those layers.

Key Takeaways

nn.Module manages parameters and is used to build layers and models.
nn.functional provides stateless functions for operations without parameter storage.
Use nn.Module for reusable layers and nn.functional for flexible, inline operations.
Combining both is common: layers as nn.Module, activations as nn.functional.
Choosing depends on whether you need parameter management or just function application.