Describe the bug
`unpatch_module` does not seem to work as expected. Once dropouts are enabled, I can't switch them off using `unpatch_module`.
To Reproduce
Run this in Colab
```python
!pip install -qq baal transformers datasets

import torch
from transformers import BertForSequenceClassification
from baal.bayesian.dropout import patch_module, unpatch_module

pretrained_weights = 'bert-base-uncased'
use_cuda = torch.cuda.is_available()

model = BertForSequenceClassification.from_pretrained(
    pretrained_model_name_or_path=pretrained_weights
)

print(f"Dropouts enabled: {model.dropout.training}")  # False here
model = patch_module(model, inplace=False)
print(f"Dropouts enabled: {model.dropout.training}")  # True here
model = unpatch_module(model, inplace=False)
print(f"Dropouts enabled: {model.dropout.training}")  # Should be False here, but it is True??

# For a vanilla NN I used to do the following; not sure if this is relevant for BERT. Any ideas?
for m in model.modules():
    if m.__class__.__name__.startswith('Dropout'):
        m.eval()
print(f"Dropouts enabled: {model.dropout.training}")  # False here
```
Expected behavior
```python
model = unpatch_module(model, inplace=False)
print(f"Dropouts enabled: {model.dropout.training}")  # <-- This should return False
```
Once `unpatch_module` has run, dropouts should no longer be in training mode.
Version (please complete the following information):
OS: Colab, Win 11
Python: 3.8.15
Baal version: 1.7.0
Additional context
Based on my understanding: while using MC-Dropout with vanilla neural nets, I frequently used `model.dropout.training` to check whether dropouts are enabled. If the same holds for HF BERT models (which I think it does, since they are NN-based), then this is essentially a bug.
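As a workaround until this is resolved, all dropout layers can be forced into eval mode after unpatching, using the same class-name check as in the reproduction above. A minimal sketch with plain `torch.nn` (the helper names `disable_dropout` and `dropout_enabled` are mine, not part of Baal's API):

```python
import torch.nn as nn

def disable_dropout(model: nn.Module) -> None:
    """Put every Dropout-like submodule into eval mode."""
    for m in model.modules():
        if m.__class__.__name__.startswith('Dropout'):
            m.eval()

def dropout_enabled(model: nn.Module) -> bool:
    """Return True if any Dropout-like submodule is still in training mode."""
    return any(
        m.training
        for m in model.modules()
        if m.__class__.__name__.startswith('Dropout')
    )

# Tiny stand-in model with a dropout layer, instead of the full BERT
net = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5))
net.train()
print(dropout_enabled(net))  # True: dropout is active in train mode
disable_dropout(net)
print(dropout_enabled(net))  # False: dropout layers are now in eval mode
```

Note this only toggles the `training` flag on the dropout modules themselves; the rest of the model's mode is left untouched.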