Divergence and Metrics #1612

Answered by talmo
jramborger78 asked this question in Help!

Hi @jramborger78,

Hey guys, I had two new questions:

1.) Is there a rule of thumb for when to stop training early after the train and validation loss diverge? Something like "once the losses have diverged by X for Y epochs, stop training"? Up to now I have mostly been flying by the seat of my pants on it.

Yes! And actually, SLEAP will do this by default so there's nothing you should have to worry about. By default it'll reduce the learning rate and/or stop training early if the validation loss stops improving for a certain number of epochs. The default values are specified in the training configs and will result in early stopping when the validat…
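For intuition, the mechanism described above (reduce the learning rate when validation loss plateaus, and stop entirely if the plateau persists) can be sketched in a few lines. This is an illustrative toy, not SLEAP's actual implementation, and the `reduce_patience`, `stop_patience`, `factor`, and `min_delta` values below are made-up placeholders; the real defaults live in SLEAP's training configs.

```python
def train_with_plateau_control(val_losses, lr=1e-3,
                               reduce_patience=5, stop_patience=10,
                               factor=0.5, min_delta=1e-6):
    """Walk a sequence of per-epoch validation losses and report the
    final learning rate and the epoch at which training stopped.

    Illustrative sketch only -- not SLEAP's code; all thresholds here
    are placeholder values.
    """
    best = float("inf")
    epochs_since_improvement = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best - min_delta:
            # Validation loss improved: reset the plateau counter.
            best = loss
            epochs_since_improvement = 0
        else:
            # No meaningful improvement this epoch.
            epochs_since_improvement += 1
            if epochs_since_improvement % reduce_patience == 0:
                lr *= factor  # reduce learning rate on plateau
            if epochs_since_improvement >= stop_patience:
                return lr, epoch  # early stopping
    return lr, len(val_losses) - 1
```

So with these placeholder settings, three improving epochs followed by a flat validation loss would trigger an LR reduction after 5 stalled epochs and a full stop after 10, without you having to watch the curves yourself.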
