In the simplest case, if the flow is a zero tensor, every source element is copied to the same position in the target tensor. If the flow changes slightly, the target tensor should also change, which means the gradient with respect to the flow should be non-zero.
However, this test code shows otherwise:
import torch
from Forward_Warp import forward_warp

a = torch.randn(1, 1, 5, 5)
flow = torch.zeros(1, 5, 5, 2, requires_grad=True)
fwarp = forward_warp()
b = fwarp(a, flow)
b.sum().backward()
print(flow.grad)
# flow.grad is an all-zero tensor.
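One way to check this intuition numerically is a finite-difference probe of the same loss (a minimal sketch reusing the setup above; the perturbed index and eps are arbitrary choices of mine):

import torch
from Forward_Warp import forward_warp

torch.manual_seed(0)
a = torch.randn(1, 1, 5, 5)
flow = torch.zeros(1, 5, 5, 2)
fwarp = forward_warp()

eps = 1e-3
flow_pert = flow.clone()
flow_pert[0, 2, 2, 0] += eps  # nudge one pixel's horizontal flow component
# Finite-difference estimate of d(b.sum()) / d(flow[0, 2, 2, 0]).
fd_grad = (fwarp(a, flow_pert).sum() - fwarp(a, flow).sum()).item() / eps
print(fd_grad)

Comparing fd_grad with the corresponding entry of flow.grad shows whether the analytic backward agrees with the numerical derivative, at least for this loss.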
Any idea what's the issue? Thanks.
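For reference, here is a minimal pure-PyTorch sketch of the general technique, differentiable forward warping via bilinear splatting, useful as a point of comparison. This is my own illustration, not the Forward_Warp source; splat_warp and the convention flow[..., 0] = horizontal offset are assumptions:

import torch

def splat_warp(src, flow):
    # src: (B, C, H, W); flow: (B, H, W, 2) with flow[..., 0] = dx, flow[..., 1] = dy.
    B, C, H, W = src.shape
    ys, xs = torch.meshgrid(
        torch.arange(H, dtype=src.dtype),
        torch.arange(W, dtype=src.dtype),
        indexing="ij",
    )
    x = xs + flow[..., 0]  # target x of each source pixel, (B, H, W)
    y = ys + flow[..., 1]  # target y of each source pixel, (B, H, W)
    x0, y0 = x.floor(), y.floor()
    out = src.new_zeros(B, C, H * W)
    for dx in (0.0, 1.0):
        for dy in (0.0, 1.0):
            xi, yi = x0 + dx, y0 + dy
            # Bilinear splat weight of this corner; differentiable in the flow.
            w = (1 - (x - xi).abs()) * (1 - (y - yi).abs())
            inside = (xi >= 0) & (xi < W) & (yi >= 0) & (yi < H)
            idx = (yi.clamp(0, H - 1) * W + xi.clamp(0, W - 1)).long()
            idx = idx.view(B, 1, -1).expand(-1, C, -1)
            contrib = (src * (w * inside.to(src.dtype)).unsqueeze(1)).view(B, C, -1)
            out = out.scatter_add(2, idx, contrib)  # accumulate corner contributions
    return out.view(B, C, H, W)

Note that wherever all four corners of a source pixel land inside the image, their splat weights sum to one, so out.sum() is locally invariant to the flow there. Under a plain .sum() loss, a (near-)zero flow gradient can therefore be mathematically correct for a splatting warp of this kind; a spatially weighted loss such as (b * torch.randn_like(b)).sum() is a sharper probe of the flow gradient.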
askerlee changed the title from "All-zero grad of flow" to "Grad of all-zero flow" on May 3, 2022.