```python
if not if_accum_grad:
    if gradient_mask_tensor is not None:  # gradient_mask_tensor is always None in this run
        for name, param in net.named_parameters():
            if name in gradient_mask_tensor:
                param.grad = param.grad * gradient_mask_tensor[name]
    optimizer.step()  # so on each step, only the second loss term ever gets masked
    optimizer.zero_grad()
acc, acc5 = torch_accuracy(pred, label, (1, 5))
```
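For reference, here is a minimal sketch of what the masking branch would do if `gradient_mask_tensor` were ever non-None: a dict keyed by parameter name, holding a tensor of the same shape that is multiplied elementwise into the gradient. The `net` and the mask values below are hypothetical, not from the repo.

```python
import torch
import torch.nn as nn

# Hypothetical single-layer net; ResRep would use a full ConvNet here.
net = nn.Conv2d(3, 8, kernel_size=3, bias=False)

# Hypothetical mask: zero the gradients of the first 4 output filters,
# keep the remaining 4 untouched. Keys match named_parameters().
gradient_mask_tensor = {
    "weight": torch.cat([torch.zeros(4, 3, 3, 3),
                         torch.ones(4, 3, 3, 3)], dim=0)
}

loss = net(torch.randn(1, 3, 32, 32)).sum()
loss.backward()

# Same consumption pattern as the loop quoted above.
for name, param in net.named_parameters():
    if name in gradient_mask_tensor:
        param.grad = param.grad * gradient_mask_tensor[name]
```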
```python
for compactor_param, mask in compactor_mask_dict.items():  # set compactor gradients separately, then add the lasso gradient
    compactor_param.grad.data = mask * compactor_param.grad.data
    # closed-form gradient of the group-lasso penalty: w / ||w||_2 per filter;
    # note the mask above multiplies only this second loss term, which differs from the paper
    lasso_grad = compactor_param.data * ((compactor_param.data ** 2).sum(dim=(1, 2, 3), keepdim=True) ** (-0.5))
    compactor_param.grad.data.add_(lasso_grad, alpha=resrep_config.lasso_strength)  # keyword form; add_(alpha, tensor) is deprecated
```
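To see why `lasso_grad` has that form: the penalty on each compactor filter `w_i` is its L2 norm `||w_i||_2`, and d/dw ||w||_2 = w / ||w||_2, which is exactly `w * (sum(w**2) ** -0.5)` reduced over each filter's non-output dimensions. A quick self-contained check against autograd (the shapes below are made up; a compactor is a 1x1 conv):

```python
import torch

# Hypothetical compactor weight: 8 filters of shape (8, 1, 1).
w = torch.randn(8, 8, 1, 1, requires_grad=True)

# Group lasso: sum of per-filter L2 norms.
lasso = (w ** 2).sum(dim=(1, 2, 3)).sqrt().sum()
lasso.backward()

# Closed form used in the quoted loop.
closed_form = w.data * ((w.data ** 2).sum(dim=(1, 2, 3), keepdim=True) ** (-0.5))
print(torch.allclose(w.grad, closed_form))  # True
```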