PyTorch · ~20 mins

Dynamic computation graph advantage in PyTorch - ML Experiment: Train & Evaluate

Experiment - Dynamic computation graph advantage
Problem: You want to understand how dynamic computation graphs in PyTorch help when working with inputs of varying sizes or structures.
Current Metrics: N/A - This is a conceptual experiment focusing on model flexibility rather than accuracy metrics.
Issue: Static computation graphs (as in some other frameworks) require fixed input sizes and structures, making it hard to handle variable input lengths or conditional operations.
Your Task
Demonstrate how PyTorch's dynamic computation graph allows building a model that can handle inputs of different lengths and apply conditional operations during the forward pass.
Use PyTorch only.
Do not use static graph frameworks.
Show model behavior with inputs of different sizes.
Solution
PyTorch
import torch
import torch.nn as nn

class DynamicModel(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.rnn_cell = nn.RNNCell(input_size, hidden_size)
        self.fc = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x shape: (batch_size, seq_len, input_size)
        batch_size, seq_len, _ = x.size()
        # Initialize hidden state on the same device/dtype as the input
        hidden = torch.zeros(batch_size, self.rnn_cell.hidden_size,
                             device=x.device, dtype=x.dtype)
        outputs = []
        for t in range(seq_len):
            input_t = x[:, t, :]
            hidden = self.rnn_cell(input_t, hidden)
            # Conditional operation: if mean of hidden > 0, apply ReLU else apply tanh
            if hidden.mean() > 0:
                activated = torch.relu(hidden)
            else:
                activated = torch.tanh(hidden)
            outputs.append(activated)
        # Stack per-step outputs along the time dimension: (batch_size, seq_len, hidden_size)
        outputs = torch.stack(outputs, dim=1)
        # Take last output for prediction
        last_output = outputs[:, -1, :]
        out = self.fc(last_output)
        return out

# Create model
model = DynamicModel(input_size=3, hidden_size=5)

# Example inputs with different sequence lengths
input1 = torch.randn(2, 4, 3)  # batch=2, seq_len=4, input_size=3
input2 = torch.randn(2, 6, 3)  # batch=2, seq_len=6, input_size=3

# Forward pass with input1
output1 = model(input1)
print(f"Output with seq_len=4: {output1}")

# Forward pass with input2
output2 = model(input2)
print(f"Output with seq_len=6: {output2}")
Implemented a manual RNNCell loop to process variable-length sequences dynamically.
Added a conditional inside the forward pass that applies a different activation depending on the mean of the hidden state.
Demonstrated that the same model handles inputs of different sequence lengths without any graph rebuilding.
Results Interpretation

Before: Static graph frameworks require fixed input sizes and fixed operations, making it hard to handle variable input lengths or conditional logic.

After: PyTorch's dynamic graph allows the model to run loops and conditionals during the forward pass, adapting to input size and data dynamically.

Dynamic computation graphs let you write models that can change their structure on the fly, making it easier to work with variable input sizes and conditional operations.
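To make "change their structure on the fly" concrete, here is a minimal sketch showing that ordinary Python control flow participates in autograd: the graph is recorded fresh on each forward pass, so gradients follow whichever branch actually ran. (The specific tensors and branch condition here are illustrative, not part of the experiment above.)

```python
import torch

x = torch.randn(3, requires_grad=True)

# Branch chosen from the data at run time; only the ops of the
# branch that executes are recorded in the graph.
if x.sum() > 0:
    y = (x * 2).sum()    # gradient w.r.t. x is 2 everywhere
else:
    y = (x ** 2).sum()   # gradient w.r.t. x is 2*x

y.backward()             # gradients flow through the branch that ran
print(x.grad)
```

Re-running with different data can take the other branch, and backpropagation still works, because there is no fixed graph to violate.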
Bonus Experiment
Modify the model to handle batches where each sequence has a different length by padding and masking.
💡 Hint
Use PyTorch's pack_padded_sequence and pad_packed_sequence utilities to handle variable-length sequences efficiently.
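Following that hint, here is one possible sketch of the bonus. It swaps the manual RNNCell loop for an nn.RNN (an assumption, since the packing utilities operate on whole padded batches) and uses pad_sequence, pack_padded_sequence, and pad_packed_sequence to batch two sequences of different lengths:

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence, pad_packed_sequence

rnn = nn.RNN(input_size=3, hidden_size=5, batch_first=True)

# Two sequences with different lengths (4 and 6 time steps)
seqs = [torch.randn(4, 3), torch.randn(6, 3)]
lengths = torch.tensor([s.size(0) for s in seqs])

# Zero-pad to a common length, then pack with the true lengths so the
# RNN skips the padded steps
padded = pad_sequence(seqs, batch_first=True)            # (2, 6, 3)
packed = pack_padded_sequence(padded, lengths,
                              batch_first=True, enforce_sorted=False)
packed_out, h_n = rnn(packed)

# Unpack back to a padded tensor; padded positions come back as zeros
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
print(out.shape)   # (2, 6, 5)
```

Because the padded steps are masked out by packing, h_n holds the hidden state at each sequence's true last step, which is what you would feed into the final linear layer.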