--model and --rescale-with-baseline can not be used together #71
hi @L-Zhe, since we couldn't precompute the baselines for your local model, you cannot use --rescale-with-baseline with it.
Because of network limitations, downloading the model directly is very slow, so I have to use a local model downloaded from BaiduPan. What can I do to use rescale-with-baseline?
hi @L-Zhe, I see the issue here. One workaround would be to put your downloaded model into the Hugging Face cache folder; you might open an issue on the transformers repo to ask how to do this. Here's a relevant issue: huggingface/transformers#2323. A quick but less principled hack is to change the model type within this code branch. You would need to clone this repo, modify the code, and install locally from source.
I use --model to load the local model and --lang en --rescale-with-baseline to enable rescaling. The output shows (hug_trans=3.0.2)-rescaled P: R: F:, but the numbers are the same as the ones produced without --rescale-with-baseline. The command I used is as follows:
bert-score -r XXX -c XXX --model XXX --num_layers 17 --lang en --rescale-with-baseline
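For context on what the flag should change: baseline rescaling in bert-score linearly maps each raw score x to (x - b) / (1 - b), where b is a baseline precomputed for a given model and language. A minimal sketch of that transform, using a hypothetical baseline value (the real baselines ship with the package and are looked up by model name, which is why a local model path has none):

```python
def rescale_with_baseline(score: float, baseline: float) -> float:
    """Linearly map a raw BERTScore onto a more readable range.

    Raw scores tend to cluster in a narrow high band; subtracting the
    baseline and renormalizing spreads them over a wider interval.
    """
    return (score - baseline) / (1 - baseline)

raw_f1 = 0.95     # example raw F1 score
baseline = 0.85   # hypothetical precomputed baseline for the model/language pair
print(round(rescale_with_baseline(raw_f1, baseline), 4))  # 0.6667
```

If the "rescaled" numbers are identical to the raw ones, the baseline was almost certainly never applied, which is consistent with the behavior reported above for a local model.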