Is the mask inconsistent with the paper? #20

Open
sdreamforchen opened this issue May 26, 2022 · 2 comments

Comments

@sdreamforchen

for compactor_param, mask in compactor_mask_dict.items():  # set the compactor gradients separately and add the lasso_grad term
    compactor_param.grad.data = mask * compactor_param.grad.data
    lasso_grad = compactor_param.data * ((compactor_param.data ** 2).sum(dim=(1, 2, 3), keepdim=True) ** (-0.5))  # this mask is multiplied with the second term of the loss, which is different from the paper
    compactor_param.grad.data.add_(resrep_config.lasso_strength, lasso_grad)

if not if_accum_grad:
    if gradient_mask_tensor is not None:  # gradient_mask_tensor is always None
        for name, param in net.named_parameters():
            if name in gradient_mask_tensor:
                param.grad = param.grad * gradient_mask_tensor[name]
    optimizer.step()  # each step only the second term ends up masked
    optimizer.zero_grad()
acc, acc5 = torch_accuracy(pred, label, (1,5))
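
For reference, here is a self-contained version of the gradient-mask-and-step pattern in the block above as I understand it (toy model; the names net, gradient_mask_tensor, and the mask values are made up for illustration, not taken from the repo):

import torch

# Toy model and optimizer standing in for the real network; purely illustrative.
net = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(net.parameters(), lr=0.1)

# A per-parameter gradient mask keyed by parameter name, like gradient_mask_tensor.
gradient_mask_tensor = {'weight': torch.tensor([[1.0], [0.0]])}  # broadcasts across the columns

pred = net(torch.randn(3, 4))
pred.pow(2).mean().backward()

# Multiply the gradient of each listed parameter by its mask before stepping,
# so masked rows receive no update from this batch.
for name, param in net.named_parameters():
    if name in gradient_mask_tensor:
        param.grad = param.grad * gradient_mask_tensor[name]
optimizer.step()
optimizer.zero_grad()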
@Lutong-Qin

Hi, could you explain what you mean? I don't quite follow. Looking at the code, I think it is written correctly.

@yannqi

yannqi commented Mar 3, 2023

It does match the paper; there is no inconsistency. At first your comment threw me off too, so I went back over the code. Pay close attention to the comments in the code below:

for compactor_param, mask in compactor_mask_dict.items():
    compactor_param.grad.data = mask * compactor_param.grad.data  # Eq. 14, first term: the compactor's objective-loss gradient multiplied by the mask
    lasso_grad = compactor_param.data * ((compactor_param.data ** 2).sum(dim=(1, 2, 3), keepdim=True) ** (-0.5))  # Eq. 14, second term: computing the Group Lasso gradient
    compactor_param.grad.data.add_(resrep_config.lasso_strength, lasso_grad)  # Eq. 14, second term: the Group Lasso gradient scaled by \lambda

My feeling is that you have mixed up compactor_param.grad.data and compactor_param.data; note the presence of .grad.
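
To make the distinction concrete, here is a minimal sketch of the Eq. 14 update on a toy 1x1 compactor (the layer size, the mask values, and the lasso_strength value are assumptions for illustration; the real code builds compactor_mask_dict from the pruning schedule):

import torch

lasso_strength = 1e-4                                           # the \lambda in Eq. 14 (assumed value)
compactor = torch.nn.Conv2d(8, 8, kernel_size=1, bias=False)    # toy 1x1 compactor

# Dummy forward/backward so .grad is populated.
x = torch.randn(2, 8, 4, 4)
compactor(x).sum().backward()

# Binary per-filter mask: 1 = keep the objective gradient, 0 = let the filter decay.
mask = torch.ones(8, 1, 1, 1)
mask[3] = 0.0                                                   # pretend filter 3 is selected for pruning

with torch.no_grad():
    w = compactor.weight.data        # parameter values  -> used for the lasso term
    g = compactor.weight.grad.data   # objective-loss gradient -> the thing the mask hits

    # First term of Eq. 14: mask * objective gradient.
    g.mul_(mask)

    # Second term of Eq. 14: lambda * K / ||K||_2, computed per filter from the *parameters*.
    lasso_grad = w * (w.pow(2).sum(dim=(1, 2, 3), keepdim=True) ** -0.5)
    g.add_(lasso_grad, alpha=lasso_strength)

# An optimizer step with this modified .grad zeroes the objective gradient only for
# masked filters, while the Group Lasso penalty gradient is applied to every filter,
# which is what drives the masked filters toward zero.

So the only thing the mask touches is compactor_param.grad.data (the first term); the lasso term is built from compactor_param.data and then added with strength \lambda, exactly as in the equation.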
