`inspect_lm_huggingface.py` now has an option to repeat `[MASK]` tokens, but this doesn't work due to huggingface/transformers#3609.
We could implement our own solution using `AutoModelWithLMHead`, following the suggestions in my comment on the above transformers issue, or implement a solution inside the transformers library and submit a PR (see the sketch below).
Also look at FitBERT, SpanBERT, and other tools that may already have implemented this.
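A minimal sketch of the naive approach, assuming `bert-base-uncased` and the current `transformers` API (`AutoModelForMaskedLM` is the non-deprecated successor of `AutoModelWithLMHead` for this use case): insert several `[MASK]` tokens and take the argmax at each masked position in a single forward pass. This reproduces the limitation from the linked issue, since each position is predicted independently of the others:

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

n_masks = 3  # number of repeated [MASK] tokens to insert
text = "The capital of France is " + " ".join([tokenizer.mask_token] * n_masks) + "."

inputs = tokenizer(text, return_tensors="pt")
mask_positions = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]

with torch.no_grad():
    logits = model(**inputs).logits

# Naive decoding: argmax at every mask position at once. This ignores
# dependencies between the masked tokens, which is the core problem
# discussed in huggingface/transformers#3609.
predicted_ids = logits[0, mask_positions].argmax(dim=-1)
print(tokenizer.decode(predicted_ids))
```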
Meng et al. (2022), "Rewire-then-Probe: A Contrastive Recipe for Probing Biomedical Knowledge of Pre-trained Language Models", propose a workaround for obtaining multi-token answers from BERT.
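For comparison, one common baseline for multi-token answers (distinct from Meng et al.'s contrastive probing recipe) is iterative mask filling: commit to the single most confident prediction among the remaining masks, re-encode, and repeat until no masks are left. A hedged sketch, again assuming `bert-base-uncased`:

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def fill_masks_iteratively(text: str) -> str:
    """Fill all [MASK] tokens one at a time, most confident first."""
    input_ids = tokenizer(text, return_tensors="pt")["input_ids"]
    while (input_ids == tokenizer.mask_token_id).any():
        with torch.no_grad():
            logits = model(input_ids=input_ids).logits[0]
        mask_positions = (input_ids[0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
        # Confidence = max softmax probability at each remaining mask.
        probs = logits[mask_positions].softmax(dim=-1)
        best_per_mask, token_ids = probs.max(dim=-1)
        pick = best_per_mask.argmax()  # fill the most confident mask this round
        input_ids[0, mask_positions[pick]] = token_ids[pick]
    return tokenizer.decode(input_ids[0], skip_special_tokens=True)

masks = " ".join([tokenizer.mask_token] * 2)
print(fill_masks_iteratively(f"Aspirin is used to treat {masks}."))
```

This is greedy and can still commit to poor early choices, but unlike the single-pass version it conditions each later prediction on the tokens already filled in.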