Latest ComfyUI update broke SUPIR node compatibility #144

I just updated ComfyUI to the latest version and none of my SUPIR node workflows work. Uninstalling, reinstalling, etc. does not help.

Comments
I just tried with the latest ComfyUI and I have no problems. What is the error you are getting?
I'm getting this as well. Here is the log: `Error occurred when executing SUPIR_conditioner: The shape of the 2D attn_mask is torch.Size([77, 77]), but should be (1, 1). File "C:\AI\ComfyUI\execution.py", line 151, in recursive_execute`
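For context, that message appears to come from PyTorch's own attention-mask shape check rather than from SUPIR itself: a 2D `attn_mask` must match the (query length, key length) of the attention call. A minimal, self-contained repro of the same check (illustrative only; not SUPIR's actual code path):

```python
import torch
import torch.nn as nn

mha = nn.MultiheadAttention(embed_dim=64, num_heads=8, batch_first=True)
q = torch.randn(1, 1, 64)    # batch of 1, sequence length 1
mask = torch.zeros(77, 77)   # mask sized for a 77-token CLIP prompt

# Raises: RuntimeError: The shape of the 2D attn_mask is
# torch.Size([77, 77]), but should be (1, 1).
mha(q, q, q, attn_mask=mask)
```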
Okay, I think I found it. Some other custom node must have updated open-clip-torch, as with its latest version I'm getting the same error.
You are amazing. Thank you! I just installed open-clip-torch==2.24.0 like you said and it's working again. The following are the nodes I have installed, and it seems CCSR is the other node that has open-clip-torch in its requirements.txt. `0.0 seconds: C:\AI\ComfyUI\custom_nodes\websocket_image_save.py`
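For anyone else hitting this before the node itself was patched (see the next comment), the temporary fix discussed above is to pin open-clip-torch back to 2.24.0 in whatever Python environment ComfyUI runs in (activate your venv, or use a portable install's embedded Python, first):

```
pip install open-clip-torch==2.24.0
```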
I found what caused it in the open-clip-torch update and added a workaround. It should now work even with the latest version.
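The actual commit isn't quoted in this thread, so purely as an illustration of what such a workaround can look like: gate the mask handling on the installed open-clip-torch version. Everything below (the function name, the version boundary, and the assumption that newer open-clip builds its own causal mask) is hypothetical:

```python
# Hypothetical sketch of a version-gated compatibility shim; the real fix
# in ComfyUI-SUPIR may look entirely different.
from importlib.metadata import version

import torch
from packaging.version import Version

_OPEN_CLIP_NEW = Version(version("open-clip-torch")) > Version("2.24.0")

def build_text_attn_mask(seq_len: int):
    # Assumption: newer open-clip constructs its own causal mask internally,
    # so passing an explicit 77x77 mask collides with the shapes it expects.
    if _OPEN_CLIP_NEW:
        return None
    return torch.empty(seq_len, seq_len).fill_(float("-inf")).triu_(1)
```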
Not working for me; it's not even importing. I tried reinstalling, and open-clip-torch is 2.24.0. `1.2 seconds (IMPORT FAILED): C:\_sd-win\StabilityMatrix-win-x64\Data\Packages\ComfyUI\custom_nodes\ComfyUI-SUPIR`
Different error then, and that doesn't tell me anything; the real error is in the log somewhere.
Here is the complete error; it looks like something with flash-attn-cuda. It worked fine yesterday. `Traceback (most recent call last): Cannot import C:\_sd-win\StabilityMatrix-win-x64\Data\Packages\ComfyUI\custom_nodes\ComfyUI-SUPIR module for custom nodes: DLL load failed while importing flash_attn_2_cuda: The specified procedure could not be found.`
@sdk401 Are you missing flash attention? Installing flash attention from their prebuilt wheels via pip never worked for me on Windows. Others have reported the same: Dao-AILab/flash-attention#595. The only way I got it working was to compile my own. I have compile scripts here: https://github.com/dicksondickson/ComfyUI-Clean-Install
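If you would rather build it yourself without the linked scripts, the upstream flash-attention README documents a source build along these lines (this assumes a matching CUDA toolkit and the MSVC build tools are already installed; `MAX_JOBS` limits parallel compile jobs to keep RAM usage manageable):

```
pip install ninja
set MAX_JOBS=4
pip install flash-attn --no-build-isolation
```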
Only if updating ComfyUI broke it. I installed it before for the Lumina wrapper and it worked fine. Will try again.
As far as I know, ComfyUI doesn't install or update Flash Attention. Comfy can use Flash Attention but doesn't require it. Although I could be wrong and my knowledge outdated.
I don't really know what's causing this, but as it seems to be related to kornia, I took a closer look at where it's used... and realized it's not necessary. So I've removed its import entirely. If that doesn't clear the error, at the very least the error should change now if you update and run.
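As a generic illustration of why removing the import helps (a sketch, not the repo's actual diff): an unconditional top-level import means any breakage inside the dependency, or in the libraries it pulls in transitively, fails the whole node pack at load time. The guarded form below is a common stand-in when a dependency is still optionally useful; here the author removed the import outright:

```python
# An unconditional `import kornia` at module top level would propagate any
# ImportError (e.g. a broken flash-attn DLL loaded transitively) to the
# whole node pack. Guarding it keeps loading resilient; removing it, as was
# done here, eliminates the dependency altogether.
try:
    import kornia  # hypothetical optional use; not required by these nodes
except ImportError:
    kornia = None
```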
Updated, error gone, nodes load, thanks! |
Update fixed the issue. Thanks! |