Fixing a small issue in trainer logging #1563
Conversation
Unsurprisingly, this broke the logging tests, probably since the
good catch, LGTM 🤖
I added the change to the tests. Let's see that everything passes now. Two questions:
Okay cool. The only test that's failing is the PyTorch-v1_5 one, which timed out. Does that make sense? As far as I'm concerned it's ready to merge, but if there's a way to rerun this test, that might be even better.
@guydav rebased
What does this PR do?
Fixes half of #1520. This is the simpler half, where the epoch is logged into a dict that isn't passed to the loggers.
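The idea behind the fix can be sketched as a small standalone helper: bookkeeping keys such as `epoch` are kept for progress-bar display but filtered out of the dict that gets forwarded to the loggers. This is an illustrative sketch, not the actual diff; the function name `split_metrics` and the dict shapes are assumptions made for this example.

```python
def split_metrics(metrics):
    """Split a metrics dict so bookkeeping keys such as 'epoch' are shown
    in the progress bar but are not forwarded to the loggers.

    Hypothetical helper for illustration; not the PR's actual code.
    """
    # everything except 'epoch' goes to the loggers
    logger_metrics = {k: v for k, v in metrics.items() if k != "epoch"}
    # the progress bar still displays the full dict, epoch included
    progress_bar_metrics = dict(metrics)
    return logger_metrics, progress_bar_metrics


metrics = {"loss": 0.25, "acc": 0.91, "epoch": 3}
to_log, to_display = split_metrics(metrics)
print(to_log)      # {'loss': 0.25, 'acc': 0.91}
print(to_display)  # {'loss': 0.25, 'acc': 0.91, 'epoch': 3}
```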
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
As much as can be in a one-line fix.