
PyTorch ecosystem overview - ML Experiment: Train & Evaluate

Experiment - PyTorch ecosystem overview
Problem: You want to understand the PyTorch ecosystem and how its components work together to build and train machine learning models.
Current Metrics: No model or code yet; just conceptual understanding.
Issue: Lack of practical experience with PyTorch components such as tensors, Dataset, DataLoader, nn.Module, optimizers, and training loops.
Your Task
Build a simple PyTorch model using core ecosystem components and observe training metrics to understand their roles.
Use PyTorch tensors, Dataset, DataLoader, nn.Module, optimizer, and training loop.
Keep the model simple (e.g., linear regression).
Print training loss for each epoch.
Solution
import torch
from torch.utils.data import Dataset, DataLoader
import torch.nn as nn
import torch.optim as optim

# Create a simple dataset
class SimpleDataset(Dataset):
    def __init__(self):
        # Features: x values from 0 to 9
        self.x = torch.arange(10, dtype=torch.float32).unsqueeze(1)
        # Targets: y = 2*x + 1
        self.y = 2 * self.x + 1
    def __len__(self):
        return len(self.x)
    def __getitem__(self, idx):
        return self.x[idx], self.y[idx]

# Define a simple linear model
class LinearModel(nn.Module):
    def __init__(self):
        super(LinearModel, self).__init__()
        self.linear = nn.Linear(1, 1)
    def forward(self, x):
        return self.linear(x)

# Prepare dataset and dataloader
dataset = SimpleDataset()
dataloader = DataLoader(dataset, batch_size=2, shuffle=True)

# Initialize model, loss function, and optimizer
model = LinearModel()
criterion = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.01)

# Training loop
for epoch in range(10):
    total_loss = 0
    for inputs, targets in dataloader:
        optimizer.zero_grad()
        outputs = model(inputs)
        loss = criterion(outputs, targets)
        loss.backward()
        optimizer.step()
        total_loss += loss.item() * inputs.size(0)
    avg_loss = total_loss / len(dataset)
    print(f"Epoch {epoch+1}: Loss = {avg_loss:.4f}")
Created a custom Dataset to hold simple linear data.
Used DataLoader to batch and shuffle data.
Defined a linear model subclassing nn.Module.
Used MSELoss as the loss function.
Used SGD optimizer with learning rate 0.01.
Implemented a training loop printing average loss per epoch.
Used super(LinearModel, self).__init__() in LinearModel for backward compatibility; in Python 3, the shorter super().__init__() works equally well.
Results Interpretation

Before: No model or training, no metrics.

After: Model trains successfully with loss decreasing each epoch, showing PyTorch components working together.

This experiment shows how PyTorch ecosystem parts like Dataset, DataLoader, nn.Module, optimizer, and training loop combine to build and train a model.
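To make the "components working together" point concrete, the learned parameters can be read directly off the trained model and compared against the generating rule y = 2*x + 1. Below is a compact full-batch sketch of this check; the seed and the 2000-step count are arbitrary choices that give the parameters time to settle.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)  # arbitrary seed for reproducibility

# The same linear data as the solution: y = 2*x + 1
x = torch.arange(10, dtype=torch.float32).unsqueeze(1)
y = 2 * x + 1

model = nn.Linear(1, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Full-batch training: long enough for the parameters to converge
for _ in range(2000):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()

# Read the learned parameters straight off the nn.Linear layer
w = model.weight.item()
b = model.bias.item()
print(f"learned weight ~ {w:.3f} (true 2), bias ~ {b:.3f} (true 1)")

# Inference: gradients are not needed when just predicting
with torch.no_grad():
    pred = model(torch.tensor([[4.0]])).item()
print(f"prediction at x=4: {pred:.3f}")  # close to 2*4 + 1 = 9
```

Seeing the weight approach 2 and the bias approach 1 ties the falling loss numbers back to what the optimizer is actually doing to the model's parameters.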
Bonus Experiment
Try replacing the SGD optimizer with the Adam optimizer and observe how the training loss changes.
💡 Hint
Use torch.optim.Adam with a learning rate of 0.01 and compare how quickly the loss decreases.
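One possible way to run this bonus experiment, reusing the solution's dataset with a bare nn.Linear layer in place of the LinearModel class (a sketch; the seed and epoch count are arbitrary choices):

```python
import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import Dataset, DataLoader

torch.manual_seed(0)  # arbitrary seed for reproducibility

# Same linear data as in the solution: y = 2*x + 1
class SimpleDataset(Dataset):
    def __init__(self):
        self.x = torch.arange(10, dtype=torch.float32).unsqueeze(1)
        self.y = 2 * self.x + 1
    def __len__(self):
        return len(self.x)
    def __getitem__(self, idx):
        return self.x[idx], self.y[idx]

dataset = SimpleDataset()
dataloader = DataLoader(dataset, batch_size=2, shuffle=True)

model = nn.Linear(1, 1)  # equivalent to the solution's LinearModel
criterion = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=0.01)  # Adam instead of SGD

losses = []
for epoch in range(10):
    total_loss = 0.0
    for inputs, targets in dataloader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()
        total_loss += loss.item() * inputs.size(0)
    losses.append(total_loss / len(dataset))
    print(f"Epoch {epoch+1}: Loss = {losses[-1]:.4f}")
```

Because Adam adapts a per-parameter step size, the shape of its loss curve at lr=0.01 can differ noticeably from SGD's at the same learning rate; printing or plotting both curves side by side makes the comparison concrete.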