RoPE on all Llama models for arbitrarily long inputs #2036
Labels: good first issue (Good for newcomers)
Comments
Transformers was updated by #2011. See FastChat/fastchat/model/model_adapter.py, lines 528 to 547 at commit 8d8c96c.
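Not the code those lines point to, but a rough sketch of how an optional rope_scaling argument could be threaded through a transformers-based loader; the function name and keyword defaults here are illustrative only:

```python
from typing import Optional

from transformers import AutoModelForCausalLM, AutoTokenizer


def load_llama(model_path: str, rope_scaling: Optional[dict] = None):
    """Illustrative loader: forwards an optional rope_scaling dict to transformers."""
    tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)

    config_kwargs = {}
    if rope_scaling is not None:
        # Only Llama-family configs understand this key,
        # e.g. {"type": "linear", "factor": 2.0}.
        config_kwargs["rope_scaling"] = rope_scaling

    model = AutoModelForCausalLM.from_pretrained(
        model_path,
        low_cpu_mem_usage=True,
        **config_kwargs,
    )
    return model, tokenizer
```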
@surak feel free to loop me in.
@DachengLi1 I have no clue how to do so :-)
Since the latest transformers library is already a requirement and the pull request below has been merged into it, we can use the new RoPE scaling setting:
huggingface/transformers#24653
Here is the documentation for it: https://huggingface.co/docs/transformers/main/en/model_doc/llama#transformers.LlamaConfig.rope_scaling
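For a quick sense of what the setting looks like in practice, here is a minimal sketch based on that documentation; the checkpoint name and scaling factor are placeholders, not part of the issue:

```python
from transformers import LlamaConfig, LlamaForCausalLM, LlamaTokenizer

model_name = "meta-llama/Llama-2-7b-hf"  # placeholder checkpoint

# rope_scaling is a dict with "type" ("linear" or "dynamic") and a float
# "factor" > 1.0; a factor of 2.0 roughly doubles the usable context window.
config = LlamaConfig.from_pretrained(
    model_name,
    rope_scaling={"type": "dynamic", "factor": 2.0},
)

model = LlamaForCausalLM.from_pretrained(model_name, config=config)
tokenizer = LlamaTokenizer.from_pretrained(model_name)
```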