When training with optimizers such as SGD or Adam, the first metric to watch is the training loss: it measures how well the model is fitting the training data, step by step. A well-tuned optimizer drives the training loss down quickly and without large oscillations.
We also track validation loss and accuracy on held-out data to check that the model generalizes to new examples rather than merely memorizing the training set; training loss that keeps falling while validation loss rises is the classic sign of overfitting.
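The monitoring loop described above can be sketched with a toy example. This is a minimal illustration, not a framework recipe: it fits a linear model with plain per-sample SGD on synthetic data (all names and hyperparameters here are made up for the sketch) and records both training and validation loss each epoch, which is the pattern you would follow with any optimizer.

```python
import numpy as np

# Hypothetical toy setup: linear regression fit with plain SGD,
# tracking training and validation loss after every epoch.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])          # ground-truth weights (for data generation only)
y = X @ true_w + 0.1 * rng.normal(size=200)  # targets with a little noise

# Hold out 25% of the data as a validation split.
X_train, X_val = X[:150], X[150:]
y_train, y_val = y[:150], y[150:]

def mse(Xs, ys, w):
    """Mean squared error of the linear model with weights w."""
    return float(np.mean((Xs @ w - ys) ** 2))

w = np.zeros(3)   # model parameters
lr = 0.05         # learning rate (illustrative choice)
history = []

for epoch in range(50):
    # One SGD epoch: update on each training sample in random order.
    for i in rng.permutation(len(X_train)):
        grad = 2 * (X_train[i] @ w - y_train[i]) * X_train[i]
        w -= lr * grad
    # Record both losses so the two curves can be compared.
    train_loss = mse(X_train, y_train, w)
    val_loss = mse(X_val, y_val, w)
    history.append((train_loss, val_loss))

print(f"final train loss: {train_loss:.4f}, val loss: {val_loss:.4f}")
```

Because both losses are logged per epoch, a widening gap between them shows up immediately, which is exactly the memorization-versus-generalization check the text describes.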