
--model and --rescale-with-baseline can not be used together #71

Closed
L-Zhe opened this issue Aug 4, 2020 · 4 comments

L-Zhe commented Aug 4, 2020

I used --model to load a local model, together with --lang en --rescale-with-baseline to enable rescaling. The output header shows (hug_trans=3.0.2)-rescaled P: R: F:, but the numbers are identical to running the same model without --rescale-with-baseline. The command I used is as follows:
bert-score -r XXX -c XXX --model XXX --num_layers 17 --lang en --rescale-with-baseline

@L-Zhe L-Zhe changed the title --model and --rescale-with-baseline can not use together --model and --rescale-with-baseline can not be used together Aug 4, 2020
Tiiiger (Owner) commented Aug 5, 2020

Hi @L-Zhe, because we couldn't precompute the baselines for your local model, you cannot use --rescale-with-baseline. This is intended behavior.
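For context on why a precomputed baseline is required: BERTScore's rescaling is a linear transform against a baseline score b estimated in advance for each model/language pair, so without a baseline file there is nothing to rescale against. A minimal sketch of the transform, with hypothetical numbers:

```python
def rescale(raw: float, baseline: float) -> float:
    """Map a raw BERTScore value to the rescaled range.

    Raw scores cluster in a narrow high band; subtracting the
    precomputed baseline b and dividing by (1 - b) spreads them out:
        rescaled = (raw - b) / (1 - b)
    """
    return (raw - baseline) / (1.0 - baseline)

raw_f1 = 0.95       # hypothetical raw F1 from the model
baseline_f1 = 0.83  # hypothetical precomputed baseline for this model/language
print(round(rescale(raw_f1, baseline_f1), 4))  # 0.7059
```

A raw score equal to the baseline maps to 0, which is why scores computed with a missing or wrong baseline silently come out unchanged or misleading.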

L-Zhe (Author) commented Aug 7, 2020

> hi @L-Zhe because we couldn't precompute the baselines for your local model, you cannot do rescale-with-baseline. This is an intended feature.

Because of network limitations, downloading the model directly is very slow, so I have to use a local copy downloaded from BaiduPan. What can I do to use rescale-with-baseline?

Tiiiger (Owner) commented Aug 10, 2020

Hi @L-Zhe, I see the issue now. One workaround would be to put your downloaded model into the Hugging Face cache folder. You might open an issue on the transformers repo to ask how to do this; here's a relevant issue: huggingface/transformers#2323.

A quicker but less principled hack is to change the model type within this code branch. You would need to clone this repo, modify the code, and install it locally from source.

@Tiiiger Tiiiger closed this as completed Aug 10, 2020
felixgwu (Collaborator) commented
Hi @L-Zhe, we just added a --baseline_path argument to our master branch to specify a custom path to the baseline file.
For example, you can just set it to this file if you use roberta-large for English sentences.
We hope this helps.
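Putting the pieces together, an invocation with the new flag might look like the following (the input files, local model path, and baseline file path are hypothetical placeholders; --baseline_path is the argument named above):

```shell
# Hypothetical paths: a local roberta-large checkpoint and the
# precomputed English baseline file for it, passed explicitly so
# rescaling works even though the model is loaded from disk.
bert-score -r refs.txt -c cands.txt \
    --model /path/to/local/roberta-large \
    --num_layers 17 --lang en \
    --rescale-with-baseline \
    --baseline_path /path/to/roberta-large.tsv
```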
