[Bug]: ModuleNotFoundError: No module named 'llama_index.readers.schema' #11937

Closed
Ethereal-sakura opened this issue Mar 14, 2024 · 5 comments
Labels: bug (Something isn't working), triage (Issue needs to be triaged/prioritized)


@Ethereal-sakura

Bug Description

When I run the following code:

from llama_hub.tools.tavily_research import TavilyToolSpec
from llama_index.agent import OpenAIAgent
import os
BASE_URL = " https://chat.***.****/v1"                
API_SECRET_KEY="sk-1NA1mSaHKprafQ7r27Ca37A74eBc47868c9eEe8b0308112b" 
os.environ["OPENAI_API_KEY"]  =  API_SECRET_KEY
os.environ["OPENAI_API_BASE"] =  BASE_URL
tavily_tool = TavilyToolSpec(
    api_key='tvly-dHbDutyoi5iyCOWopGTb*******',
)
agent = OpenAIAgent.from_tools(tavily_tool.to_tool_list())
agent.chat('Please introduce the city of shanghai')

I get the following error:

Traceback (most recent call last):
  File "e:\langchian_learning\Tavily\search.py", line 1, in <module>
    from llama_hub.tools.tavily_research import TavilyToolSpec
  File "D:\anaconda\envs\LLAMA\lib\site-packages\llama_hub\tools\tavily_research\__init__.py", line 2, in <module>
    from llama_hub.tools.tavily_research.base import (
  File "D:\anaconda\envs\LLAMA\lib\site-packages\llama_hub\tools\tavily_research\base.py", line 4, in <module>    
    from llama_index.readers.schema.base import Document
ModuleNotFoundError: No module named 'llama_index.readers.schema'

The same error occurs on Colab:
[screenshot of the same traceback in Colab]

Version

0.10.19

Steps to Reproduce

llama-index                              0.10.19
llama-index-agent-openai                 0.1.5
llama-index-cli                          0.1.7
llama-index-core                         0.10.19
llama-index-embeddings-huggingface       0.1.4
llama-index-embeddings-openai            0.1.6
llama-index-indices-managed-llama-cloud  0.1.3
llama-index-legacy                       0.9.48
llama-index-llms-openai                  0.1.7
llama-index-llms-replicate               0.1.3
llama-index-multi-modal-llms-openai      0.1.4
llama-index-program-openai               0.1.4
llama-index-question-gen-openai          0.1.3
llama-index-readers-file                 0.1.7
llama-index-readers-llama-parse          0.1.3
llama-index-vector-stores-chroma         0.1.5

Run the code

from llama_hub.tools.tavily_research import TavilyToolSpec
from llama_index.agent import OpenAIAgent
import os
BASE_URL = " https://chat.****.****/v1"                   # 设置代理
API_SECRET_KEY="sk-1NA1mSaHKprafQ7r27Ca37A74eBc4786******" # 设置api-key
os.environ["OPENAI_API_KEY"]  =  API_SECRET_KEY
os.environ["OPENAI_API_BASE"] =  BASE_URL
tavily_tool = TavilyToolSpec(
    api_key='tvly-dHbDutyoi5iyCOWopGTb******',
)
agent = OpenAIAgent.from_tools(tavily_tool.to_tool_list())

agent.chat('Please introduce the city of shanghai')

Relevant Logs/Tracebacks

No response

Ethereal-sakura added the bug and triage labels on Mar 14, 2024

dosubot bot commented Mar 14, 2024

Hey there, @Ethereal-sakura! Great to see you back. Hope you're doing well and ready to dive into another exciting puzzle together. 😊

Thank you for reporting this issue. The error you're encountering, ModuleNotFoundError: No module named 'llama_index.readers.schema', suggests that the Python interpreter cannot find a module named schema within a package or sub-package llama_index.readers.

Based on the information from similar issues in the LlamaIndex repository, it seems that the import paths have changed in the transition from version 0.9.x to 0.10.19 of LlamaIndex. The llama_index.readers.schema module might have been moved or renamed in the newer version.

Here are some steps you can try to resolve the issue:

  1. Update the import statement to:

     from llama_index.core import Document

     This was suggested in a similar issue, #11138. A minimal sketch of this change follows after this list.

  2. If the issue persists, you can try creating a fresh virtual environment and reinstalling the necessary packages:

     python -m venv venv
     source venv/bin/activate
     pip install llama-index llama-index-tools-code-interpreter

     This was suggested in another similar issue, #11011.

  3. Note that llama-hub is deprecated as of llama-index v0.10.x. If you're using llama-hub, you'll need to update your code to use the new llama-index packages.

  4. If the problem continues, start from a completely fresh virtual environment: in a new terminal, uninstall any globally installed llama-index, create a new virtual environment, and install llama-index within it:

     pip uninstall llama-index  # just in case it was installed globally
     python -m venv venv
     source venv/bin/activate
     pip install llama-index
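
For concreteness, here is a minimal sketch of the import change from step 1, assuming llama-index 0.10.x (where Document now lives in llama-index-core):

# Old 0.9.x-style path that llama_hub's tavily_research/base.py still imports:
#   from llama_index.readers.schema.base import Document

# New location in llama-index 0.10.x:
from llama_index.core import Document

doc = Document(text="Shanghai is a city in China.")
print(doc.text)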

Please try these steps and let us know if the issue persists. If it does, please provide more information about your setup, including how you installed llama_index, the output of pip show llama_index, your Python version, and your operating system. This will help us better understand the issue and provide a more accurate solution.


@anoopshrma
Collaborator

Hey! You have posted your openai key. I would highly suggest you remove it!

@anoopshrma
Collaborator

LlamaHub is deprecated. All the loaders/tools have been converted into separate PyPI packages.

You'll need to install the Tavily tool with pip install llama-index-tools-tavily-research.
Then you can import it with
from llama_index.tools.tavily_reasearch import TavilyToolSpec

@Ethereal-sakura
Author

Llamahub is deprecated. All the loaders/tools have been converted into separate pypi packages.

You'll need to install tavily with pip install llama-index-tools-tavily-research Then you can import it with from llama_index.tools.tavily_reasearch import TavilyToolSpec

Thank you for your answer

@JYC0413

JYC0413 commented Aug 16, 2024

Llamahub is deprecated. All the loaders/tools have been converted into separate pypi packages.

You'll need to install tavily with pip install llama-index-tools-tavily-research Then you can import it with from llama_index.tools.tavily_research import TavilyToolSpec

Actually it doesn't matter much, but a small correction: the import in the comment above is misspelled as
from llama_index.tools.tavily_reasearch import TavilyToolSpec
The correct module name is tavily_research, as quoted above.
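
Putting the thread's suggestions together, here is a minimal end-to-end sketch. It assumes llama-index 0.10.x with llama-index-agent-openai and llama-index-tools-tavily-research installed; the API keys and base URL below are placeholders:

# pip install llama-index llama-index-tools-tavily-research
import os

from llama_index.agent.openai import OpenAIAgent              # assumed 0.10.x path (llama-index-agent-openai)
from llama_index.tools.tavily_research import TavilyToolSpec  # note the spelling: tavily_research

os.environ["OPENAI_API_KEY"] = "sk-..."                        # placeholder OpenAI key
os.environ["OPENAI_API_BASE"] = "https://chat.example.com/v1"  # placeholder proxy base URL

tavily_tool = TavilyToolSpec(api_key="tvly-...")               # placeholder Tavily key

agent = OpenAIAgent.from_tools(tavily_tool.to_tool_list())
print(agent.chat("Please introduce the city of Shanghai"))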
