When using optimizers like SGD, Adam, or RMSprop, the key metric to watch is the training loss, which measures how well the model fits the training data. A steadily decreasing loss indicates that the optimizer's parameter updates are moving the model in the right direction. Validation loss, computed on held-out data, is watched alongside it to check that the model is learning patterns that generalize to new data rather than memorizing the training set.
Accuracy is important too, but loss is the quantity the optimizer directly minimizes, so it gives a finer-grained picture of how each update step is guiding the model's learning.
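As a concrete illustration, the sketch below trains a one-parameter linear model with plain SGD on synthetic data and records training and validation loss after each epoch. Everything here (the data-generating function, learning rate, and epoch count) is an illustrative assumption, not taken from any particular library or the text above; it is a minimal hand-rolled loop, not a framework API.

```python
import random

random.seed(0)

def make_data(n):
    # Synthetic data: y = 3x plus a little Gaussian noise (illustrative choice).
    data = []
    for _ in range(n):
        x = random.uniform(-1, 1)
        data.append((x, 3 * x + random.gauss(0, 0.1)))
    return data

train, val = make_data(80), make_data(20)

def mse(w, data):
    # Mean squared error of the model y_hat = w * x on a dataset.
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

w, lr = 0.0, 0.02
initial_train_loss = mse(w, train)
history = []
for epoch in range(20):
    for x, y in train:
        grad = 2 * (w * x - y) * x   # gradient of squared error w.r.t. w
        w -= lr * grad               # SGD update: step against the gradient
    # Track both losses: train loss shows fit, val loss shows generalization.
    history.append((mse(w, train), mse(w, val)))

print(f"initial train loss: {initial_train_loss:.4f}")
print(f"final train loss:   {history[-1][0]:.4f}")
print(f"final val loss:     {history[-1][1]:.4f}")
```

If both curves fall together, the optimizer is fitting patterns that generalize; a training loss that keeps dropping while validation loss rises is the classic sign of memorization.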