StableDiffusionXLPipeline does not work when COMPEL=1 #982
Comments
@mudler I am also having this bug, is there a fix?
Same error trying to follow the StableDiffusionXLPipeline example.
@lianee Welp, that's an oops, so in your env you need to add …
@lunamidori5 I use the docker image, and with …
I've updated the usage of Compel for XL models within #1746.
LocalAI version:
quay.io/go-skynet/local-ai:latest
Environment, CPU architecture, OS, and Version:
OS: Ubuntu 16.04 LTS
CPU architecture : x86_64
Linux dbcloud 4.4.0-142-generic #168-Ubuntu SMP Wed Jan 16 21:00:45 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
Describe the bug
The StableDiffusionXLPipeline for the Diffusers backend does not work when the environment variable COMPEL=1 (or when it is not set). Instead, it returns an error.

To Reproduce
Simply follow the Linaqruf/animagine-xl example in the docs (https://localai.io/model-compatibility/diffusers/#model-setup); a request like the sketch below reproduces the error.
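For reference, a minimal sketch of the kind of request that triggers the bug, assuming LocalAI is running locally on port 8080 and the model is named `animagine-xl` in its YAML config (both assumptions; adjust to your setup):

```python
# Hedged sketch: calls LocalAI's OpenAI-compatible image generation endpoint.
# Host, port, and model name are assumptions based on a default docker setup.
import requests

resp = requests.post(
    "http://localhost:8080/v1/images/generations",
    json={
        "model": "animagine-xl",              # name from the model YAML (assumption)
        "prompt": "1girl, solo, looking at viewer",
        "size": "1024x1024",
    },
    timeout=600,
)
# With COMPEL=1 this returns an error instead of an image URL.
print(resp.status_code, resp.text)
```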
Expected behavior

The example should generate an image instead of returning an error.
Logs
Additional context
Looks like the problem comes from #904, where Compel was introduced to handle long prompts. The Diffusers API for SDXL requires both prompt_embeds and pooled_prompt_embeds (maybe they'll change that in the future, see huggingface/diffusers#4341), yet the code doesn't cover this scenario; see the sketch after this paragraph.

By the way, it would be better if the .env file contained an option for the environment variable COMPEL. The above issue could really confuse someone new to LocalAI (like me) when they find that the official example does not work 🤔
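For illustration, here is a hedged sketch (not LocalAI's actual backend code) of how Compel can produce both tensors that the SDXL pipeline expects; the model name and prompt are placeholders:

```python
# Sketch of SDXL-aware Compel usage, assuming the compel and diffusers packages
# are installed and a CUDA device is available.
import torch
from compel import Compel, ReturnedEmbeddingsType
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "Linaqruf/animagine-xl", torch_dtype=torch.float16
).to("cuda")

# SDXL has two text encoders; ask Compel for pooled embeddings from the
# second one, which is exactly what a single-encoder code path omits.
compel = Compel(
    tokenizer=[pipe.tokenizer, pipe.tokenizer_2],
    text_encoder=[pipe.text_encoder, pipe.text_encoder_2],
    returned_embeddings_type=ReturnedEmbeddingsType.PENULTIMATE_HIDDEN_STATES_NON_NORMALIZED,
    requires_pooled=[False, True],
)

conditioning, pooled = compel("1girl, looking at viewer, masterpiece")

# Both tensors must be passed; passing only prompt_embeds raises the error
# this issue describes.
image = pipe(
    prompt_embeds=conditioning,
    pooled_prompt_embeds=pooled,
    num_inference_steps=25,
).images[0]
image.save("out.png")
```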