When we talk about learning rate strategy and convergence, the key metric to watch is training loss, which tells us how well the model is learning step by step. If the loss decreases smoothly, the learning rate is letting the optimizer make steady progress toward better parameters. If the loss oscillates or blows up, the learning rate is likely too large; if the loss decreases very slowly or plateaus while still high, it is likely too small. Either way, the model cannot learn efficiently.
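The effect of the learning rate on convergence can be seen in a minimal sketch: plain gradient descent on the toy loss f(x) = x², whose gradient is 2x. The specific rates used here (0.001, 0.1, 1.1) are illustrative assumptions, not values from the text.

```python
def gradient_descent(lr, steps=50, x0=5.0):
    """Run gradient descent on f(x) = x**2 and return the final loss."""
    x = x0
    for _ in range(steps):
        x -= lr * 2 * x  # gradient of x**2 is 2*x
    return x * x  # final loss value

for lr in (0.001, 0.1, 1.1):
    print(f"lr={lr}: final loss {gradient_descent(lr):.3g}")
# lr=0.001 -> loss decreases very slowly and is still high after 50 steps
# lr=0.1   -> loss converges smoothly toward 0
# lr=1.1   -> each update overshoots the minimum and the loss diverges
```

With lr=0.1 the update shrinks x by a constant factor each step, so the loss drops geometrically; with lr=1.1 the factor has magnitude greater than one, so the iterates grow without bound, which is the "loss jumps around or blows up" symptom described above.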
Besides training loss, validation loss is important for checking whether the model is genuinely generalizing or merely memorizing the training data. A good learning rate strategy brings both training and validation loss down steadily; if training loss keeps falling while validation loss starts rising, the model is overfitting.