It seems that with my 2-GPU setup, training on the Something-Something-v1 dataset takes ~3.5 days for 60 epochs.
Do we really need the full 60 epochs to reach the reported results, or can they be obtained with fewer?
Could you please tell me whether your results can be reproduced with fewer epochs (e.g. 30-40)? Did you try that? The training time is quite long for me.
Could you also share your .log file, if possible?
P.S.: I am training the num_segments=8 case.
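For reference, if I try a shorter run myself, my plan is to scale the LR decay milestones by the same ratio as the epoch count so the schedule keeps its shape. This is only a generic PyTorch sketch, not this repo's actual training script, and the milestone values are assumptions for illustration:

```python
import torch
from torch.optim.lr_scheduler import MultiStepLR

model = torch.nn.Linear(10, 2)  # placeholder model, stands in for the real network
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

full_epochs = 60
short_epochs = 40
full_milestones = [30, 45]  # assumed decay points for the original 60-epoch schedule

# Scale milestones so the LR drops at the same fraction of training: [20, 30]
scale = short_epochs / full_epochs
short_milestones = [int(m * scale) for m in full_milestones]

scheduler = MultiStepLR(optimizer, milestones=short_milestones, gamma=0.1)

for epoch in range(short_epochs):
    # ... one training epoch over Something-v1 would go here ...
    optimizer.step()   # stands in for the per-batch updates
    scheduler.step()
```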