FIX Prefix tuning test w/ rotary emb on multi GPU (#2311)
See huggingface/transformers#35235 (comment) for context. A refactor in transformers moved the rotary embedding of Mistral (and probably other models) to the model level. As a result, the device map used in one of the tests became incorrect. This PR fixes the device map. Note that this fix is not really specific to prefix tuning; the error occurred even before prefix tuning was applied.
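For illustration only (not the exact test code from this PR), a minimal sketch of the kind of device map adjustment involved: after the refactor, the rotary embedding is a model-level module (e.g. `model.rotary_emb`), so it needs its own entry when the model is split across GPUs. The module names and device assignments below are assumptions based on the Mistral architecture, not the test's actual map.

```python
from transformers import AutoModelForCausalLM

# Hypothetical device map splitting a small Mistral across two GPUs.
# "model.rotary_emb" is listed explicitly because, after the transformers
# refactor, the rotary embedding lives at the model level instead of
# inside each decoder layer.
device_map = {
    "model.embed_tokens": 0,
    "model.rotary_emb": 0,   # model-level module after the refactor
    "model.layers.0": 0,
    "model.layers.1": 1,
    "model.norm": 1,
    "lm_head": 1,
}

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",
    device_map=device_map,
)
```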