The epoch count is reset whenever the round is incremented; is that expected? When I train wikitext I get the following sequence:

round 0 - epoch 0 - batch 0
round 0 - epoch 0 - batch 1
[...]
round 0 - epoch 0 - batch 5
round 0 - epoch 1 - batch 0
[...]
round 0 - epoch 1 - batch 5
round 0 - epoch 2 - batch 0
round 1 - epoch 0 - batch 0
round 1 - epoch 0 - batch 1
Shouldn't the epoch continue incrementing rather than being reset? (Clearly there needs to be more explanation in the UI regarding rounds, epochs, and batches; see #691.)
I was thinking of a roughly descending relation, translated into words as "each round, run X epochs; each epoch, run Y batches", so every round/epoch restarts its lower layer.
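To make that nesting concrete, here is a minimal TypeScript sketch of the loop structure I have in mind. The constant names (ROUNDS, EPOCHS_PER_ROUND, BATCHES_PER_EPOCH) are hypothetical, not actual config names in the codebase; the point is only to show why the epoch and batch counters restart at 0 inside every round:

```ts
// Hypothetical constants, for illustration only.
const ROUNDS = 2
const EPOCHS_PER_ROUND = 3
const BATCHES_PER_EPOCH = 6

for (let round = 0; round < ROUNDS; round++) {
  for (let epoch = 0; epoch < EPOCHS_PER_ROUND; epoch++) {
    for (let batch = 0; batch < BATCHES_PER_EPOCH; batch++) {
      // Each round runs its own epochs and each epoch its own batches,
      // so the lower counters reset whenever the higher one increments.
      console.log(`round ${round} - epoch ${epoch} - batch ${batch}`)
    }
  }
  // Communication / weight aggregation would happen here, once per round.
}
```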
Hm, as a user I would like to know how many epochs the model has been trained for so far (without having to do a multiplication myself and remember how rounds translate into epochs).
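Even if the per-round reset is kept internally, the UI could derive a cumulative count for display. A minimal sketch, assuming hypothetical round, epoch and epochsPerRound values that the training loop already tracks:

```ts
// Cumulative (zero-based) epoch index across rounds, for display purposes.
function totalEpochs(round: number, epoch: number, epochsPerRound: number): number {
  return round * epochsPerRound + epoch
}

// Example: round 1, epoch 0 with 3 epochs per round is the 4th epoch overall.
console.log(totalEpochs(1, 0, 3)) // 3
```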
Is it even useful to show the round number at all? Could we instead only indicate when communication is happening (with an animation or something), without necessarily showing the round number?