Experiment - requires_grad flag
Problem: You have a simple neural network model in PyTorch. The model trains well, but you want to freeze some layers so they do not update during training. Currently, every layer updates its weights.
Current Metrics: Training loss decreases from 1.0 to 0.1 over 10 epochs; training accuracy reaches 95%.
Issue: The model updates all parameters, including those you want to keep fixed. This wastes computation on gradients you never use and may cause overfitting in layers you intended to leave unchanged.
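A minimal sketch of the fix: set `requires_grad = False` on the parameters you want frozen, and hand only the still-trainable parameters to the optimizer. The model and layer indices below are illustrative, not from the original note.

```python
import torch
import torch.nn as nn

# Hypothetical two-layer model; layer names/sizes are assumptions for the demo.
model = nn.Sequential(
    nn.Linear(4, 8),   # layer 0: we will freeze this one
    nn.ReLU(),
    nn.Linear(8, 2),   # layer 2: stays trainable
)

# Freeze the first linear layer: autograd stops computing gradients for it.
for param in model[0].parameters():
    param.requires_grad = False

# Pass only trainable parameters to the optimizer so frozen ones are skipped.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=0.1
)

# One training step: frozen weights keep .grad == None, trainable ones get grads.
out = model(torch.randn(3, 4))
loss = out.sum()
loss.backward()

print(model[0].weight.grad)              # None: no gradient for the frozen layer
print(model[2].weight.grad is not None)  # True: trainable layer got a gradient
```

Filtering the optimizer's parameter list is optional for correctness (frozen parameters with `grad=None` are not updated anyway), but it avoids wasted optimizer state and makes the intent explicit.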