Thank you for your work.
When I train my watermarking network, it gets NaN when the quality is set to 100. However, it is just a small bug and does not affect the project as a whole.
I also ran into this bug. It happens because the quality factor becomes 0 when you set the quality to 100, which then causes a division by zero in the quantization step. My solution is to add a small epsilon to the factor:
quality/100. + 0.0001 in the function quality_to_factor.
Of course, if you use a random quality such as torch.rand(1)*100, this bug does not appear, because the range is [0, 100). In my experiment I set the value to 100. - torch.rand(1)*50; although that range is (50, 100], the distribution is continuous, so the probability of hitting exactly 100 is almost zero.
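For reference, here is a minimal sketch of the fix, assuming quality_to_factor follows the standard JPEG quality scaling used in DiffJPEG-style implementations (the exact function in this repo may differ slightly):

```python
import torch

def quality_to_factor(quality):
    """Map a JPEG quality in (0, 100] to a quantization scale factor.

    Without the epsilon, quality == 100 maps to a factor of 0, and the
    quantization step then divides by a table scaled by 0, producing NaN.
    """
    if quality < 50:
        quality = 5000. / quality
    else:
        quality = 200. - quality * 2
    return quality / 100. + 0.0001  # epsilon guards against factor == 0

# Alternatively, sample the quality so it never reaches 100 exactly:
quality = (torch.rand(1) * 100).item()        # range [0, 100), never hits 100
# or restrict training to higher qualities, as described above:
quality = (100. - torch.rand(1) * 50).item()  # range (50, 100], 100 only if rand() == 0
```

Either guard works on its own; the epsilon is the more robust choice, since it also protects any caller that passes quality = 100 explicitly.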