
Dropout (nn.Dropout) in PyTorch

Introduction

Dropout helps a neural network avoid overfitting by randomly turning off some neurons during training. This makes the model more reliable on new data.

Use dropout in these situations:

When training a neural network that tends to memorize the training data.
When you want your model to generalize better to unseen examples.
When your model is very complex and prone to overfitting.
When you want to improve the robustness of your neural network.
Syntax
PyTorch
torch.nn.Dropout(p=0.5, inplace=False)

p is the probability of dropping (zeroing) each neuron; the default is 0.5.

inplace determines whether the operation modifies the input tensor directly or returns a new tensor; the default is False.

Examples
Creates a dropout layer that randomly drops 30% of neurons during training.
PyTorch
dropout = torch.nn.Dropout(p=0.3)
Creates a dropout layer that drops 70% of neurons and modifies the input tensor directly.
PyTorch
dropout = torch.nn.Dropout(p=0.7, inplace=True)
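In practice, a dropout layer usually sits between other layers of a model. Here is a minimal sketch (the layer sizes and variable names are illustrative) placing nn.Dropout inside an nn.Sequential network:

```python
import torch
import torch.nn as nn

# Hypothetical two-layer network with dropout between the layers
model = nn.Sequential(
    nn.Linear(16, 32),
    nn.ReLU(),
    nn.Dropout(p=0.3),   # zeroes each activation with probability 0.3 during training
    nn.Linear(32, 4),
)

x = torch.randn(8, 16)   # batch of 8 samples with 16 features each

model.train()            # dropout active
train_out = model(x)

model.eval()             # dropout disabled
eval_out = model(x)

print(train_out.shape, eval_out.shape)  # both torch.Size([8, 4])
```

Calling model.train() or model.eval() on the container propagates the mode to the dropout layer inside it, so you rarely need to toggle the layer directly.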
Sample Model

This code shows how dropout randomly turns off neurons during training mode but leaves them unchanged during evaluation mode.

PyTorch
import torch
import torch.nn as nn

# Create dropout layer with 50% dropout rate
dropout = nn.Dropout(p=0.5)

# Input tensor simulating activations from previous layer
input_tensor = torch.tensor([[1.0, 2.0, 3.0, 4.0],
                             [5.0, 6.0, 7.0, 8.0]])

# Set dropout to training mode to activate dropout
dropout.train()
output_train = dropout(input_tensor)

# Set dropout to evaluation mode to turn off dropout
dropout.eval()
output_eval = dropout(input_tensor)

print("Input Tensor:")
print(input_tensor)
print("\nOutput with Dropout (training mode):")
print(output_train)
print("\nOutput without Dropout (eval mode):")
print(output_eval)
Important Notes

Dropout is only active in training mode. In evaluation mode, it passes the input through unchanged.

During training, the surviving activations are scaled up by 1/(1 - p) (so-called inverted dropout) so that the expected signal strength matches what the next layer sees at evaluation time.

Summary

Dropout randomly disables neurons during training to reduce overfitting.

Use nn.Dropout with a probability p to set the dropout rate.

Remember to switch between training and evaluation modes to control dropout behavior.
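Dropout is also available in functional form as torch.nn.functional.dropout, where the training flag is passed explicitly instead of being taken from the module's train/eval mode. A brief sketch:

```python
import torch
import torch.nn.functional as F

x = torch.ones(2, 4)

# training=False disables dropout, so the input is returned unchanged
y = F.dropout(x, p=0.5, training=False)
print(torch.equal(y, x))  # True
```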