[Bug]: No module named 'llama_index.core.llms.generic_utils' #11071
Bug Description

I installed llama-index in a Google Colab notebook:

!pip install llama-index-embeddings-anyscale
!pip install -U llama-index llama-index-core llama-index-llms-openai

When I try to import

from llama_index.embeddings.anyscale import AnyscaleEmbedding

I get this error:

ModuleNotFoundError: No module named 'llama_index.core.llms.generic_utils'

Version

llama-index==0.10.10

Steps to Reproduce

Install llama-index and llama-index-embeddings-anyscale on Colab and import

from llama_index.embeddings.anyscale import AnyscaleEmbedding

Comments
🤖 Hey @MudassirAqeelAhmed! Good to see you back here. Hope you're doing well. Based on the error message you provided, it seems like the module 'llama_index.core.llms.generic_utils' is not found in your Python environment. This could be due to a few reasons. One quick thing to try is reinstalling the package:
!pip uninstall llama-index
!pip install llama-index
However, without more information about the LlamaIndex package and its dependencies, it's hard to provide a definitive solution. Please check the package documentation or contact the package maintainers for more information. In the meantime, I found some similar issues in the LlamaIndex repository that might be helpful.
I hope this helps! If you have any other questions or need further clarification, feel free to ask.
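Before reinstalling blindly, it can help to see exactly which llama-index distributions are installed and whether their versions line up. A minimal diagnostic sketch using only the Python standard library:

# List every installed llama-index-* distribution and its version,
# to spot a core/integration version mismatch.
from importlib.metadata import distributions

for dist in distributions():
    name = dist.metadata["Name"] or ""
    if name.startswith("llama-index"):
        print(f"{name}=={dist.version}")

If llama-index-core and an integration package come from different 0.10.x snapshots, one of them can import a module path that no longer exists in the other, which is exactly the symptom reported here.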
When I tried this, the issue had been fixed in llama-index-core, but the code hadn't changed in the actual installed package when I checked the file path. It might require updating pyproject.toml, but that's going to be a pain. You can try running it as a .py script and manually changing the code for now.
@MudassirAqeelAhmed also update the OpenAI embeddings:
pip uninstall llama-index  # remove any global install
python -m venv venv
source venv/bin/activate
pip install llama-index llama-index-embeddings-openai

@MudassirAqeelAhmed can you try this?
I'm running this in a notebook.
pip uninstall llama-index  # remove any global install
python -m venv venv
source venv/bin/activate
pip install llama-index llama-index-embeddings-openai
pip install ipykernel
python -m ipykernel install --user --name=my_venv --display-name="my_venv"

and then select the my_venv kernel in your notebook.
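After registering the kernel, it's worth confirming that the notebook is really executing inside the new venv rather than a global environment. A minimal check to run in a notebook cell:

# Show which interpreter the kernel is running; if this path is not inside
# your venv, pip is installing into a different environment than the one
# your imports resolve from.
import sys

print(sys.executable)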
@MudassirAqeelAhmed I had this same problem and it was solved by:
I managed to solve this problem by copying generic_utils.py from llama_index/core/base/llms into llama_index/core/llms.
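For anyone who wants that copy as a runnable step, here is a throwaway sketch of the workaround (it locates the installed package without importing it, since the import itself is what fails). A clean venv with matching versions is still the better fix:

# Copy generic_utils.py from llama_index/core/base/llms into
# llama_index/core/llms, locating the package via its spec so we never
# execute the (possibly broken) package __init__.
import shutil
from importlib.util import find_spec
from pathlib import Path

spec = find_spec("llama_index.core")   # locate without importing the package
core_dir = Path(spec.origin).parent    # .../site-packages/llama_index/core
src = core_dir / "base" / "llms" / "generic_utils.py"
dst = core_dir / "llms" / "generic_utils.py"

if src.exists() and not dst.exists():
    shutil.copy(src, dst)
    print(f"copied {src} -> {dst}")
else:
    print(f"nothing copied (src exists: {src.exists()}, dst exists: {dst.exists()})")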
Or you could have installed in a fresh env / upgraded your deps 😀 👍🏻
For some reason it didn't work for me 🤷
Thank you! This worked for me as well; none of the other methods mentioned worked!
I had the exact same issue, this worked for me as well, thank you!
Pretty much every integration I try to use with LlamaIndex ends up being problematic 😭
@ycd it's probably more productive to share the issues and get a resolution :) If you are migrating from v0.9.x, it's really recommended to start with a fresh venv.
@logan-markewich to be fair, I was just yapping; I solved it already. But to be more specific and add a little context: I encountered about three integrations/projects that have obsolete documentation that doesn't "just work". I even ended up monkey-patching some of them to get them working (not this one). EDIT: I don't remember the exact issue, but some of the out-of-date docs were external projects that provide an integration with LlamaIndex (e.g. Chainlit).
Okay, I was using

from llama_index.llms.anthropic.base import Anthropic

and got:

File "/Users/yagu/wope/castor/env/lib/python3.11/site-packages/llama_index/llms/anthropic/base.py", line 21, in <module>
  from llama_index.core.base.llms.generic_utils import (
ModuleNotFoundError: No module named 'llama_index.core.base.llms.generic_utils'

This time, I decided to give your suggestion a try, bombed my env, created a new one, and there it is: my env is conflicting now.

The conflict is caused by:
  The user requested llama-index-core==0.10.3
  llama-index 0.10.5 depends on llama-index-core<0.11.0 and >=0.10.0
  llama-index-agent-openai 0.1.1 depends on llama-index-core<0.11.0 and >=0.10.1
  llama-index-callbacks-langfuse 0.1.2 depends on llama-index-core<0.11.0 and >=0.10.8

Okay, great, let's upgrade. Then:

File "/Users/yagu/wope/castor/env/lib/python3.11/site-packages/llama_index/core/prompts/base.py", line 37, in <module>
  from llama_index.core.llms.base import BaseLLM
File "/Users/yagu/wope/castor/env/lib/python3.11/site-packages/llama_index/core/llms/__init__.py", line 12, in <module>
  from llama_index.core.llms.custom import CustomLLM
File "/Users/yagu/wope/castor/env/lib/python3.11/site-packages/llama_index/core/llms/custom.py", line 19, in <module>
  from llama_index.core.llms.llm import LLM
File "/Users/yagu/wope/castor/env/lib/python3.11/site-packages/llama_index/core/llms/llm.py", line 43, in <module>
  from llama_index.core.prompts import BasePromptTemplate, PromptTemplate
ImportError: cannot import name 'BasePromptTemplate' from partially initialized module 'llama_index.core.prompts' (most likely due to a circular import)

So I got a circular import error, found your #11032, then decided to upgrade again:

ERROR: Cannot install -r requirements.txt (line 70) and llama-index-agent-openai==0.1.1 because these package versions have conflicting dependencies.
The conflict is caused by:
  The user requested llama-index-agent-openai==0.1.1
  llama-index 0.10.15 depends on llama-index-agent-openai<0.2.0 and >=0.1.4

It's fine, I can do that, so I resolved all the conflicts. Then, even on 0.10.15, it still hits me with this:

from llama_index.llms.gemini.base import Gemini

File "/Users/yagu/wope/castor/env/lib/python3.11/site-packages/llama_index/llms/gemini/base.py", line 20, in <module>
  from llama_index.core.utilities.gemini_utils import (
ModuleNotFoundError: No module named 'llama_index.core.utilities.gemini_utils'
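As an aside for anyone debugging the same thing: the tracebacks above reference generic_utils under two different paths, so a quick probe can tell you which layout the installed llama-index-core actually has. A minimal sketch:

# Check which of the two module paths from the tracebacks above exists
# in the installed llama-index-core.
from importlib.util import find_spec

for name in (
    "llama_index.core.llms.generic_utils",
    "llama_index.core.base.llms.generic_utils",
):
    try:
        found = find_spec(name) is not None
        print(name, "->", "found" if found else "missing")
    except ImportError as exc:  # a parent package failed to import
        print(name, "->", f"error: {exc}")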
@ycd what are your project reqs? It seems like at this point I would just start a fresh venv with the latest versions of things. And yeah, sadly we can't control docs from people that use the llama-index package 😅