The learning rate might be more a piece of debug information than something to keep track of during training. Printing too much info per epoch is a bit of a problem for readability, and the learning rate doesn't really inform about the training the way the loss or the metrics do.
Since it can fit in the row, it shouldn't be that much of a change. If you prefer, I can just not print it, but it's a bit of a hassle to change the code every time 😅 You have probably seen the output in PR #161.
I found it useful, for example, when EarlyStop = False, because it can help fine-tune the maximum number of epochs.
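For illustration, here is a minimal sketch (not this project's actual code; all names are hypothetical) of what appending the learning rate to the per-epoch log line could look like, assuming a simple exponential-decay schedule and a placeholder loss:

```python
# Hypothetical sketch: print the current learning rate on the same row as
# the loss, so each epoch stays on a single line.

def train(epochs=3, lr=0.1, decay=0.5):
    lines = []
    loss = 1.0  # placeholder value, just for the printout
    for epoch in range(1, epochs + 1):
        loss *= 0.8  # stand-in for a real training step
        line = f"epoch {epoch}/{epochs}  loss={loss:.4f}  lr={lr:.5f}"
        print(line)
        lines.append(line)
        lr *= decay  # decay the learning rate each epoch
    return lines

if __name__ == "__main__":
    train()
```

Since the learning rate is just one extra field on an existing row, the readability cost stays small while still showing how the schedule evolves.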
I was wondering whether it would be convenient to print additional information, such as the learning rate, at each iteration of training.
This could be useful for understanding how the training is progressing, IMHO.