
[Feature Request]: Agent model-context-protocol support #17626

Open
dmi3coder opened this issue Jan 25, 2025 · 1 comment
Labels: enhancement (New feature or request), triage (Issue needs to be triaged/prioritized)

Comments

dmi3coder (Contributor):
Feature Description

MCP is an open protocol that standardizes how applications provide context to LLMs. It would be good for llama_index to support MCP integration (e.g., for agents).

https://www.anthropic.com/news/model-context-protocol
https://modelcontextprotocol.io/
https://github.com/punkpeye/awesome-mcp-servers?tab=readme-ov-file

Reason

Instead of manually defining the tools that agents can use, we could add an MCP provider that connects to locally installed MCP servers and exposes their tools automatically.
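One possible shape for that provider is sketched below. This is a hypothetical adapter, not a shipped API: it assumes the official `mcp` Python SDK (`ClientSession`, `stdio_client`) and llama_index's `FunctionTool.from_defaults`, and the helper name `mcp_tools_to_llama_index` is invented for illustration.

```python
# Hypothetical sketch: list the tools a local MCP server exposes and wrap
# each one as a llama_index FunctionTool. Assumes `pip install mcp` and
# llama_index >= 0.10; the wiring here is illustrative only.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from llama_index.core.tools import FunctionTool


async def mcp_tools_to_llama_index(params: StdioServerParameters) -> list[FunctionTool]:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            listed = await session.list_tools()
            tools = []
            for t in listed.tools:
                # Bind the tool name via a default arg so each closure
                # calls the right MCP tool.
                async def call_tool(tool_name: str = t.name, **kwargs):
                    result = await session.call_tool(tool_name, kwargs)
                    return result.content

                tools.append(
                    FunctionTool.from_defaults(
                        async_fn=call_tool,
                        name=t.name,
                        description=t.description or "",
                    )
                )
            # NOTE: in a real integration the session must outlive this
            # function (the tools call it later); a sketch only.
            return tools
```

A production version would keep the client session alive for the agent's lifetime rather than closing it on return, but the listing-and-wrapping flow would look roughly like this.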

Value of Feature

Better integration with the latest tooling ecosystem, and broader coverage of supported tools.

dmi3coder added the enhancement and triage labels on Jan 25, 2025
logan-markewich (Collaborator) commented Jan 25, 2025

If you can show me an example where one script/process launches an MCP server, and another script/process uses a client to interact with that server, I'll make the integration.

I spent a few hours when MCP launched and couldn't figure it out. Their docs had the server and client launching in the same script...
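For reference, the requested two-script setup could plausibly look like the sketch below, assuming the official `mcp` Python SDK (`pip install mcp`). With the stdio transport, the client process spawns the server script as a subprocess, so the server and client run as separate processes with separate entry points; the file names and the `add` tool are invented for illustration.

```python
# --- server.py (process 1) ---
# A minimal MCP server using the SDK's FastMCP helper (an assumption
# about the API surface; hedged sketch, not a verified recipe).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo")


@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b


if __name__ == "__main__":
    mcp.run()  # serves over stdio until the client disconnects


# --- client.py (process 2) ---
# Launches server.py as a subprocess and calls its `add` tool.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool("add", {"a": 2, "b": 3})
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```

For two fully independent, long-lived processes (neither spawning the other), the SDK's SSE transport would be the alternative to stdio, at the cost of running an HTTP endpoint.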

Projects: none yet · No branches or pull requests · 2 participants