Hi, this is a really great method for stabilizing GANs. I am just curious how important the weight average is for stabilizing the GANs (in this case). You mentioned in the paper that it increases performance slightly; is it a viable option to remove it completely? I could check this experimentally, but I'm just looking for your insights.
The weight average reduces the noise, but it also works without it. That said, for more varied datasets like ImageNet, the noise can be quite large. You can alternatively anneal the learning rate (we did that in an earlier version of this paper), but it's a bit annoying to find a good schedule, and we obtained better results with the moving average.
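For reference, here is a minimal sketch of what such a parameter moving average typically looks like, assuming a PyTorch setup; the function name `update_average` and the decay value are illustrative, not taken from this repo or the paper:

```python
import copy
import torch

def update_average(model_tgt, model_src, beta=0.999):
    # Exponential moving average of the training generator's parameters
    # into a separate "averaged" copy:
    #   p_tgt <- beta * p_tgt + (1 - beta) * p_src
    with torch.no_grad():
        for p_tgt, p_src in zip(model_tgt.parameters(), model_src.parameters()):
            p_tgt.copy_(beta * p_tgt + (1.0 - beta) * p_src)

# Illustrative usage: keep a frozen copy of the generator, update it after
# every training step, and sample from the averaged copy at test time.
# generator_avg = copy.deepcopy(generator)
# for each training iteration:
#     ... generator / discriminator updates ...
#     update_average(generator_avg, generator, beta=0.999)
```

Removing the average then just means sampling from the training generator directly instead of the averaged copy.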