
Fix rope_theta arg for diffusers_attention #4656

Merged (2 commits) on Nov 8, 2023

Conversation

@lekurile (Contributor) commented Nov 8, 2023

This PR updates diffusers_attention to properly pass the rope_theta arg to the linear_func calls. This was added in GH-4480 and needed to be updated for the diffusers attention module as well.
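The shape of the fix can be sketched as follows. This is an illustrative reconstruction, not the actual DeepSpeed code: `linear_func`, `DiffusersAttention`, and the default theta value here are hypothetical stand-ins for the real kernel call and module, showing only the pattern of an argument that a kernel accepts but a caller previously dropped.

```python
# Hypothetical sketch of forwarding rope_theta through an attention module
# to its linear kernel call. All names below are illustrative stand-ins,
# not the exact DeepSpeed signatures.

def linear_func(hidden_states, weight, bias, rope_theta=10000.0):
    """Stand-in for the inference linear kernel that accepts rope_theta."""
    # Toy computation so the sketch is runnable; the real kernel uses
    # rope_theta for rotary position embeddings.
    return [h * weight + bias for h in hidden_states], rope_theta

class DiffusersAttention:
    def __init__(self, config):
        # Before the fix, a rope_theta set on the config was effectively
        # ignored because the forward pass never passed it along.
        self.rope_theta = getattr(config, "rope_theta", 10000.0)

    def forward(self, hidden_states, weight, bias):
        # After the fix: rope_theta is threaded through to linear_func,
        # matching the calling convention introduced in GH-4480.
        return linear_func(hidden_states, weight, bias,
                           rope_theta=self.rope_theta)
```

The point of the change is purely plumbing: the kernel already took the parameter (added in GH-4480), and the diffusers attention path just needed to pass it instead of letting the kernel fall back to its default.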

@mrwyattii mrwyattii merged commit 00757a1 into master Nov 8, 2023
15 checks passed
@mrwyattii mrwyattii deleted the lekurile/fix_rope_theta branch November 8, 2023 20:08
mauryaavinash95 pushed a commit to mauryaavinash95/DeepSpeed that referenced this pull request Feb 17, 2024

Co-authored-by: Michael Wyatt <[email protected]>