
StableDiffusionXLPipeline does not work when COMPEL=1 #982

Closed
ljcleo opened this issue Aug 30, 2023 · 5 comments · Fixed by #1746
ljcleo commented Aug 30, 2023

LocalAI version:
quay.io/go-skynet/local-ai:latest

Environment, CPU architecture, OS, and Version:
OS: Ubuntu 16.04 LTS
CPU architecture: x86_64
Linux dbcloud 4.4.0-142-generic #168-Ubuntu SMP Wed Jan 16 21:00:45 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux

Describe the bug
StableDiffusionXLPipeline for the Diffusers backend does not work when the environment variable COMPEL is set to 1 (or left unset). Instead, the request returns:

{"error":{"code":500,"message":"rpc error: code = Unknown desc = Exception calling application: If `prompt_embeds` are provided, `pooled_prompt_embeds` also have to be passed. Make sure to generate `pooled_prompt_embeds` from the same text encoder that was used to generate `prompt_embeds`.","type":""}}

To Reproduce
Simply follow the Linaqruf/animagine-xl example in the doc (https://localai.io/model-compatibility/diffusers/#model-setup).
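A minimal request to trigger the error might look like this (a sketch, assuming the model was set up as in the animagine-xl example from the docs and LocalAI is listening on its default port; the prompt and model name here are illustrative):

```shell
# Hypothetical reproduction request against LocalAI's OpenAI-compatible
# image generation endpoint; adjust host, port, and model name to match
# your setup.
curl http://localhost:8080/v1/images/generations \
  -H "Content-Type: application/json" \
  -d '{
    "model": "animagine-xl",
    "prompt": "face focus, cute, masterpiece, best quality",
    "size": "1024x1024"
  }'
```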

Expected behavior

Logs

Additional context
Looks like the problem comes from #904 where Compel is introduced to handle long prompts. The Diffusers API for SDXL requires both prompt_embeds and pooled_prompt_embeds (maybe they'll change that in the future, see huggingface/diffusers#4341), yet the code doesn't cover this scenario.
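For context, the Compel library can produce both tensors that the SDXL pipeline expects; a sketch of the pattern (following the SDXL section of the compel README, not LocalAI's actual backend code; `pipeline` is assumed to be an already-loaded StableDiffusionXLPipeline):

```python
# Sketch: SDXL has two text encoders, and only the second one yields the
# pooled embeddings that StableDiffusionXLPipeline requires alongside
# prompt_embeds.
from compel import Compel, ReturnedEmbeddingsType

compel = Compel(
    tokenizer=[pipeline.tokenizer, pipeline.tokenizer_2],
    text_encoder=[pipeline.text_encoder, pipeline.text_encoder_2],
    returned_embeddings_type=ReturnedEmbeddingsType.PENULTIMATE_HIDDEN_STATES_NON_NORMALIZED,
    requires_pooled=[False, True],  # pooled embeds come from the second encoder
)

conditioning, pooled = compel("face focus, cute, masterpiece, best quality")
image = pipeline(
    prompt_embeds=conditioning,
    pooled_prompt_embeds=pooled,  # omitting this raises the error above
).images[0]
```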

By the way, it would be better if the .env file contained an option for the COMPEL environment variable. The issue above could really confuse someone new to LocalAI (like me) who finds that the official example does not work 🤔
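For anyone hitting this in the meantime, the workaround is a one-line addition to the environment (variable name as used by the Diffusers backend; 0 disables Compel):

```shell
# Workaround until the docs are updated: disable Compel so the plain
# `prompt` string is passed to the SDXL pipeline instead of `prompt_embeds`.
COMPEL=0
```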

@ljcleo ljcleo added the bug Something isn't working label Aug 30, 2023
@lunamidori5 (Collaborator)

@mudler I am also having this bug; is there a fix?


lianee commented Nov 4, 2023

Same error trying to follow the StableDiffusionXLPipeline example.

lunamidori5 (Collaborator) commented Nov 5, 2023

> same error trying to follow the StableDiffusionXLPipeline example

@lianee Welp, that's an oops. In your env you need to add COMPEL=0, then it will work. I'll push an update to the how-to tonight to make that clearer.


lianee commented Nov 5, 2023

@lunamidori5 I use the Docker image, and with -e COMPEL=0 on the command line this works.
I had tried this before, but had dumbly put the parameter at the end of the command line. Thanks for making me try again.
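For others reading this: the -e flag has to come before the image name, because Docker treats everything after the image as the command to run inside the container. A sketch of the corrected invocation (image tag and port mapping are illustrative):

```shell
# -e must precede the image name; placed after it, the flag is passed to
# the container's entrypoint instead of being read by `docker run`.
docker run -p 8080:8080 -e COMPEL=0 quay.io/go-skynet/local-ai:latest
```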

mudler (Owner) commented Mar 7, 2024

I've updated the usage of Compel with XL models in #1746.
