Please help #16
Hi. Please check if the
Thank you very much for answering me. Where can I get that checkpoint? I'm a little lost, sorry.
Hi, here are the checkpoints. Thanks.
I don't know where I should put the files or how I should do it :(
Just merge the downloaded folder with your /models folder in ComfyUI. Thanks.
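For anyone following along: after the merge, the LLaVA checkpoint should sit in its own folder directly under ComfyUI/models. A minimal sanity check you can run from the ComfyUI root, assuming the folder keeps the name llava-v1.5-7b-finetune-clean that appears in the logs further down:

```python
# Quick sanity check, run from the ComfyUI root directory.
# Assumption: the checkpoint folder keeps the name
# "llava-v1.5-7b-finetune-clean" shown in the error logs below.
import os

checkpoint_dir = os.path.join("models", "llava-v1.5-7b-finetune-clean")

if not os.path.isdir(checkpoint_dir):
    print(f"Missing folder: {checkpoint_dir} - the download was not merged into models/")
elif not os.path.isfile(os.path.join(checkpoint_dir, "config.json")):
    print(f"{checkpoint_dir} exists but has no config.json - the checkpoint files are incomplete")
else:
    print(f"{checkpoint_dir} looks OK")
```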
Yes. You are correct ;)
the same way
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
Cannot import E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_MagicQuill module for custom nodes: No module named 'comfyui_controlnet_aux'
Import times for custom nodes:

Okay, I disabled the other custom nodes to see it clearly; this is the log now.
[START] Security scan
ComfyUI-Manager: installing dependencies done.
** ComfyUI startup time: 2024-12-16 00:55:18.115834
Prestartup times for custom nodes:
Total VRAM 4096 MB, total RAM 32666 MB
Loading: ComfyUI-Manager (V2.55.4)
ComfyUI Revision: 2890 [9a616b81] *DETACHED | Released on '2024-12-04'
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
Cannot import E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_MagicQuill module for custom nodes: You can't pass

Sorry, so silly of me. After re-enabling the controlnet custom node, I get the same errors :(
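The first failure in these logs is not about the checkpoints at all: MagicQuill fails to import because the comfyui_controlnet_aux package it depends on cannot be found. A small hedged check, run from the ComfyUI root and assuming the ControlNet auxiliary preprocessors node is installed under its default folder name:

```python
# Hedged check, run from the ComfyUI root directory: MagicQuill imports
# "comfyui_controlnet_aux" (see the "No module named ..." error above),
# so that custom node must be installed and enabled before MagicQuill loads.
# Assumption: the node lives under custom_nodes with its default folder name.
import os

aux_dir = os.path.join("custom_nodes", "comfyui_controlnet_aux")
if os.path.isdir(aux_dir):
    print(f"Found {aux_dir}; make sure its own requirements are installed too.")
else:
    print(f"{aux_dir} is missing; install the ControlNet auxiliary preprocessors custom node first.")
```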
Check this issue: #7.
Cannot import E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_MagicQuill module for custom nodes: E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\llava-v1.5-7b-finetune-clean does not appear to have a file named config.json. Checkout 'https://huggingface.co/E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\llava-v1.5-7b-finetune-clean/None' for available files.

Is there a problem with how the custom node is being downloaded in the first place? Am I overlooking something?
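This particular error means transformers found the folder but refuses to load it as a local model because config.json is not inside it, i.e. the checkpoint files were not fully copied. A hedged way to see what actually landed in that folder (path taken verbatim from the error message):

```python
# Hedged sketch: list what is actually inside the checkpoint folder the
# node points at. transformers needs at least config.json there to treat
# the folder as a local model. Path copied from the error message above.
import os

path = r"E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\llava-v1.5-7b-finetune-clean"

if not os.path.isdir(path):
    print("Folder does not exist:", path)
else:
    for name in sorted(os.listdir(path)):
        print(name)
```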
Traceback (most recent call last):
File "D:\CondaENV\Trelliz\lib\site-packages\transformers\utils\hub.py", line 385, in cached_file
resolved_file = hf_hub_download(
File "D:\CondaENV\Trelliz\lib\site-packages\huggingface_hub\utils_validators.py", line 106, in inner_fn
validate_repo_id(arg_value)
File "D:\CondaENV\Trelliz\lib\site-packages\huggingface_hub\utils_validators.py", line 160, in validate_repo_id
raise HFValidationError(
huggingface_hub.errors.HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name, max length is 96: 'E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\llava-v1.5-7b-finetune-clean'.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\nodes.py", line 2037, in load_custom_node
module_spec.loader.exec_module(module)
File "", line 883, in exec_module
File "", line 241, in call_with_frames_removed
File "E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_MagicQuill_init.py", line 20, in
from .magic_quill import MagicQuill
File "E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_MagicQuill\magic_quill.py", line 104, in
class MagicQuill(object):
File "E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_MagicQuill\magic_quill.py", line 106, in MagicQuill
llavaModel = LLaVAModel()
File "E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_MagicQuill\llava_new.py", line 29, in init
self.tokenizer, self.model, self.image_processor, self.context_len = load_pretrained_model(
File "E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_MagicQuill\LLaVA\llava\model\builder.py", line 121, in load_pretrained_model
tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)
File "D:\CondaENV\Trelliz\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 758, in from_pretrained
tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
File "D:\CondaENV\Trelliz\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 590, in get_tokenizer_config
resolved_config_file = cached_file(
File "D:\CondaENV\Trelliz\lib\site-packages\transformers\utils\hub.py", line 450, in cached_file
raise EnvironmentError(
OSError: Incorrect path_or_model_id: 'E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\llava-v1.5-7b-finetune-clean'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
Cannot import E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_MagicQuill module for custom nodes: Incorrect path_or_model_id: 'E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\llava-v1.5-7b-finetune-clean'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
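Reading this traceback bottom-up: AutoTokenizer.from_pretrained only treats its argument as a local checkpoint when that directory actually exists; otherwise transformers hands the string to huggingface_hub as a repo id, and a Windows path like the one above can never satisfy repo-id validation, hence the HFValidationError. In other words, it is the same root cause again: the llava-v1.5-7b-finetune-clean folder is not where the node expects it. A hedged sketch of a guard you could run (or add before load_pretrained_model is called) to fail with a clearer message:

```python
# Hedged sketch: check the local checkpoint folder before handing the
# path to transformers, so the failure is a clear FileNotFoundError
# instead of the confusing HFValidationError. Path from the traceback above.
import os

model_path = r"E:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\llava-v1.5-7b-finetune-clean"

if not os.path.isdir(model_path):
    raise FileNotFoundError(
        f"LLaVA checkpoint folder not found: {model_path}. "
        "Download the checkpoint and merge it into ComfyUI/models first."
    )
# An existing directory is loaded as a local model by transformers,
# so it will not try to resolve the path as a Hugging Face Hub repo id.
```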