Trying to backward through the graph a second time, but the buffers have already been freed. Specify retain_graph=True when calling backward the first time.
#5 · Open · shaoeric opened this issue on Sep 13, 2020 · 3 comments
When updating the sub network, is there any need to retain the graph, e.g. `loss.backward(retain_graph=True)`?
When I reproduce the procedure, the code crashes with this error, but I don't know whether retaining the graph is the correct fix.
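To illustrate, here is a minimal standalone sketch (hypothetical, not this repo's code) of the pattern that raises this error: two `backward()` calls traverse the same graph, so the first call must either retain the graph or the losses must be combined into a single backward pass.

```python
import torch
import torch.nn as nn

net = nn.Linear(8, 4)
x = torch.randn(16, 8)

out = net(x)                      # one forward pass builds one graph
loss_a = out.pow(2).mean()
loss_b = (out - 1).abs().mean()   # second loss sharing the same graph

# Without retain_graph=True, the graph's saved buffers are freed here,
# and the second backward() raises the error in the title of this issue.
loss_a.backward(retain_graph=True)
loss_b.backward()                 # works because the graph was retained

# Alternative that avoids retaining the graph:
# (loss_a + loss_b).backward()
```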
@shaoeric When I ran this code, this error didn't appear, but it is plausible that your run hits it: the code is no longer maintained and may well contain bugs. The core of the code is the implementation of the KL divergence loss, and I have verified that that part is correct.
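For reference, a typical PyTorch formulation of that KL distillation term looks like the sketch below. This is a hypothetical helper, not copied from this repo; the function name, temperature value, and reduction mode are assumptions.

```python
import torch.nn.functional as F

def kd_kl_loss(student_logits, teacher_logits, temperature=4.0):
    """Knowledge-distillation KL term (hypothetical signature, for illustration only)."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # reduction="batchmean" matches the mathematical definition of KL divergence;
    # the T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2
```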