Hi, when I loaded the detectron_100_resnet_most_data model, some problems occurred:
While copying the parameter named "module.image_embedding_models_list.0.0.image_attention_model.modal_combine.Fa_image.main.0.weight_g", whose dimensions in the model are torch.Size([]) and whose dimensions in the checkpoint are torch.Size([1]). While copying the parameter named "module.image_embedding_models_list.0.0.image_attention_model.modal_combine.Fa_txt.main.0.weight_g", whose dimensions in the model are torch.Size([]) and whose dimensions in the checkpoint are torch.Size([1]).
I think weight_norm might have caused it, but I don't know the reason. Can you give me some advice?
@YuJiang01 PyTorch 0.4.0. I have found the problem: my PyTorch version creates a different shape for the weight_norm parameter. I just need to convert each single-element tensor to a scalar with a = a[0].
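The workaround above can be applied to the whole checkpoint before calling load_state_dict. Below is a minimal sketch, assuming the checkpoint is a plain state dict; the helper name fix_weight_norm_scalars is hypothetical, not part of the repository:

```python
import torch

def fix_weight_norm_scalars(state_dict):
    """Convert 1-element weight_g tensors (shape [1]) in a checkpoint to
    0-dim scalar tensors (shape []), matching what newer PyTorch expects."""
    fixed = {}
    for name, tensor in state_dict.items():
        if name.endswith("weight_g") and tensor.shape == torch.Size([1]):
            # Indexing a 1-element 1-D tensor drops the dimension,
            # yielding a 0-dim scalar tensor -- the a = a[0] trick above.
            tensor = tensor[0]
        fixed[name] = tensor
    return fixed
```

Then load with model.load_state_dict(fix_weight_norm_scalars(checkpoint)); if your checkpoint nests the weights under a key such as "model", unwrap it first (that key name is an assumption here).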
1) Add support for BERT embedding extraction (#12)
2) Generic pretrained embedding loading
3) Restructure base_model to abstract away text embedding init
4) Question ID is now passed in the data sample meta info