PyTorch · ML · ~20 mins

Why nn.Module organizes model code in PyTorch - Experiment to Prove It

Experiment - Why nn.Module organizes model code
Problem: You want to build a neural network model in PyTorch, but your code is messy and hard to manage. You notice that nn.Module could help organize it.
Current Metrics: No accuracy metric yet; just messy code with repeated sections and unclear structure.
Issue: The model code is unorganized, making it hard to reuse, debug, and extend. Without nn.Module you write forward passes and manage parameters by hand, which is error-prone.
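For illustration, a hypothetical "before" version of the same model without nn.Module might look like this, with every weight tensor created and tracked by hand (all names here are made up for the sketch):

```python
import torch

torch.manual_seed(0)

# Parameters created manually; each must have requires_grad=True
W1 = torch.randn(10, 20, requires_grad=True)
b1 = torch.zeros(20, requires_grad=True)
W2 = torch.randn(20, 2, requires_grad=True)
b2 = torch.zeros(2, requires_grad=True)

def forward(x):
    # The forward pass is a loose function, not attached to the parameters
    h = torch.relu(x @ W1 + b1)
    return h @ W2 + b2

x = torch.randn(5, 10)
out = forward(x)
print(out.shape)  # torch.Size([5, 2])

# The optimizer needs an explicit list; it is easy to forget a tensor
# as the model grows, and nothing groups these pieces together.
params = [W1, b1, W2, b2]
```

This is the kind of scattered structure the task below asks you to clean up.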
Your Task
Rewrite a simple neural network using nn.Module to organize the code clearly and correctly.
Use PyTorch and the nn.Module class.
Define the layers in __init__ and the computation in the forward method.
Do not change the model architecture or dataset.
Solution
PyTorch
import torch
import torch.nn as nn
import torch.optim as optim

# Define a simple neural network using nn.Module
class SimpleNN(nn.Module):
    def __init__(self):
        super(SimpleNN, self).__init__()
        self.fc1 = nn.Linear(10, 20)  # input layer to hidden
        self.relu = nn.ReLU()
        self.fc2 = nn.Linear(20, 2)   # hidden to output

    def forward(self, x):
        x = self.fc1(x)
        x = self.relu(x)
        x = self.fc2(x)
        return x

# Create model instance
model = SimpleNN()

# Dummy input and target
X = torch.randn(5, 10)  # batch of 5 samples, 10 features each
Y = torch.randint(0, 2, (5,))  # batch of 5 labels (0 or 1)

# Loss and optimizer
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(model.parameters(), lr=0.1)

# Training step
model.train()
optimizer.zero_grad()
outputs = model(X)
loss = criterion(outputs, Y)
loss.backward()
optimizer.step()

# Print loss to confirm code runs
print(f"Training loss: {loss.item():.4f}")
Created a class SimpleNN inheriting from nn.Module.
Moved layer definitions to __init__ method.
Implemented forward method for data flow.
Used model.parameters() for optimizer to manage weights automatically.
Called super(SimpleNN, self).__init__() so the base class can register the layers; in Python 3, the shorter super().__init__() is equivalent.
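To see the automatic parameter management at work, here is a minimal check (re-declaring the same SimpleNN) that lists every tensor nn.Module registered for you:

```python
import torch.nn as nn

class SimpleNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 20)
        self.relu = nn.ReLU()
        self.fc2 = nn.Linear(20, 2)

    def forward(self, x):
        return self.fc2(self.relu(self.fc1(x)))

model = SimpleNN()

# nn.Module discovered each layer's weights and biases automatically;
# model.parameters() hands all of them to the optimizer in one call.
for name, p in model.named_parameters():
    print(name, tuple(p.shape))
# fc1.weight (20, 10)
# fc1.bias (20,)
# fc2.weight (2, 20)
# fc2.bias (2,)
```

Nothing had to be tracked by hand: assigning a layer as an attribute in __init__ is enough for it to be registered.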
Results Interpretation

Before: Model code was scattered with manual layer calls and no clear structure.
After: Model code is clean, reusable, and easy to read with nn.Module organizing layers and forward pass.

Using nn.Module helps organize model code by grouping layers and computation in one place. It makes code easier to maintain, reuse, and extend.
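The reuse point can be made concrete: modules nest, so a small building block defined once can be stacked into larger models, and parameters() still finds every weight recursively. A short sketch (the Block name is hypothetical, chosen for illustration):

```python
import torch
import torch.nn as nn

class Block(nn.Module):
    """A reusable Linear -> ReLU unit."""
    def __init__(self, d_in, d_out):
        super().__init__()
        self.fc = nn.Linear(d_in, d_out)
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(self.fc(x))

# A stack of Blocks is itself a Module; parameters() collects
# every weight from every nested submodule automatically.
model = nn.Sequential(Block(10, 20), Block(20, 20), nn.Linear(20, 2))

total = sum(p.numel() for p in model.parameters())
print(total)

x = torch.randn(5, 10)
print(model(x).shape)  # torch.Size([5, 2])
```

This composability is what makes nn.Module-based code easy to extend: adding depth means adding another Block, not rewiring manual tensors.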
Bonus Experiment
Add a dropout layer to the model using nn.Module to reduce overfitting.
💡 Hint
Add nn.Dropout in __init__ and apply it in forward between layers.
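One possible sketch of the bonus solution, following the hint (the class name and p=0.5 are illustrative choices, not prescribed by the exercise):

```python
import torch
import torch.nn as nn

class SimpleNNDropout(nn.Module):
    def __init__(self, p=0.5):
        super().__init__()
        self.fc1 = nn.Linear(10, 20)
        self.relu = nn.ReLU()
        self.dropout = nn.Dropout(p)  # randomly zeroes activations in training
        self.fc2 = nn.Linear(20, 2)

    def forward(self, x):
        x = self.dropout(self.relu(self.fc1(x)))
        return self.fc2(x)

model = SimpleNNDropout()
x = torch.randn(5, 10)

model.train()  # dropout active during training
model.eval()   # dropout disabled automatically for evaluation
print(model(x).shape)  # torch.Size([5, 2])
```

A side benefit of nn.Module here: model.train() and model.eval() switch dropout on and off for you, with no manual flag-passing.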