
Inspecting model parameters in PyTorch

Introduction

We inspect model parameters to understand what the model has learned and to gauge its size and complexity. Typical situations where this is useful:

When you want to see the weights and biases of a neural network layer.
To check if the model parameters are updating during training.
To count how many parameters your model has to estimate its size.
When debugging to ensure layers are connected correctly.
To save or load specific parts of a model.
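The last point — saving or loading specific parts of a model — works through the state dict, which maps parameter names to tensors. Below is a minimal sketch (the layer names fc1 and fc2 are illustrative) that keeps only one layer's entries and loads them into a fresh model:

```python
import torch
import torch.nn as nn

# A tiny model with two named layers
model = nn.Sequential()
model.add_module("fc1", nn.Linear(4, 3))
model.add_module("fc2", nn.Linear(3, 1))

# Keep only the fc1.* entries of the state dict
fc1_state = {k: v for k, v in model.state_dict().items() if k.startswith("fc1")}

# Load them into a fresh model; strict=False permits a partial load
fresh = nn.Sequential()
fresh.add_module("fc1", nn.Linear(4, 3))
fresh.add_module("fc2", nn.Linear(3, 1))
fresh.load_state_dict(fc1_state, strict=False)

# fc1 now matches the original model; fc2 keeps its own fresh initialization
print(torch.equal(fresh.fc1.weight, model.fc1.weight))  # True
```

Passing `strict=False` is what allows the state dict to cover only a subset of the model's parameters.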
Syntax
PyTorch
for name, param in model.named_parameters():
    print(name, param.shape, param.requires_grad)

model.named_parameters() yields each parameter's name together with its tensor.

param.shape shows the size of the parameter tensor.

param.requires_grad tells you whether the parameter will receive gradient updates during training.

Examples
This prints only the names of all parameters in the model.
PyTorch
for name, param in model.named_parameters():
    print(name)
This prints the actual values (weights and biases) stored in each parameter.
PyTorch
for name, param in model.named_parameters():
    print(param.data)
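Note that param.data bypasses autograd; param.detach() is the safer way to read values. Printed values are most useful when compared across training steps — for instance, to confirm parameters actually update (one of the use cases above). A minimal sketch, with arbitrary layer sizes and learning rate:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)  # make the run deterministic

model = nn.Linear(4, 3)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

before = model.weight.detach().clone()  # snapshot before the update

loss = model(torch.randn(8, 4)).sum()
loss.backward()
optimizer.step()

# If training works, the stored weights now differ from the snapshot
print(torch.equal(model.weight.detach(), before))  # False
```

If this ever prints True, the parameter is not being updated — a common sign of a frozen layer or an optimizer that was not given that parameter.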
This counts and prints the total number of parameters in the model.
PyTorch
total_params = sum(param.numel() for param in model.parameters())
print(f"Total parameters: {total_params}")
Sample Model

This code creates a small neural network with two layers. It then prints each parameter's name, shape, and whether it will be updated during training. Finally, it counts all parameters.

PyTorch
import torch
import torch.nn as nn

# Define a simple model
class SimpleModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 3)
        self.fc2 = nn.Linear(3, 1)

    def forward(self, x):
        x = self.fc1(x)
        x = torch.relu(x)
        x = self.fc2(x)
        return x

model = SimpleModel()

# Inspect parameters
for name, param in model.named_parameters():
    print(f"{name}: shape={param.shape}, requires_grad={param.requires_grad}")

# Count total parameters
total_params = sum(param.numel() for param in model.parameters())
print(f"Total parameters: {total_params}")
Output
fc1.weight: shape=torch.Size([3, 4]), requires_grad=True
fc1.bias: shape=torch.Size([3]), requires_grad=True
fc2.weight: shape=torch.Size([1, 3]), requires_grad=True
fc2.bias: shape=torch.Size([1]), requires_grad=True
Total parameters: 19
Important Notes

Parameters with requires_grad=True will be updated during training.

You can freeze parameters by setting param.requires_grad = False.

Inspecting parameters helps in debugging and understanding model size.
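The freezing note above can be sketched by matching parameter names — here everything in a layer called fc1 is frozen (the name is illustrative):

```python
import torch.nn as nn

model = nn.Sequential()
model.add_module("fc1", nn.Linear(4, 3))
model.add_module("fc2", nn.Linear(3, 1))

# Freeze every parameter whose name starts with "fc1"
for name, param in model.named_parameters():
    if name.startswith("fc1"):
        param.requires_grad = False

# Verify which parameters remain trainable
for name, param in model.named_parameters():
    print(name, param.requires_grad)
```

This is the usual starting point for fine-tuning, where early layers are frozen and only later layers keep learning.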

Summary

Model parameters include weights and biases that the model learns.

You can list parameters with their names and shapes using named_parameters().

Counting parameters helps estimate model complexity and memory needs.