💡 idea: Soften the Enterprise requirement for multiple LLM backends (Add AI Service) #169
Comments
Hey @AnnoyingTechnology, thanks for the feedback. Can you help us understand your use case for multi-LLM support? I'm also curious how your team is using Mattermost in general, and how large your instance is in terms of user count. We've heard that the overwhelming majority of small teams use just one LLM model, while larger instances and enterprises may need multi-LLM support to run various custom and tuned models. Also, please check out the design preview of some work we're doing to enhance multi-LLM support even further: #69 (comment). We'd look forward to hearing your feedback on that thread regarding the designs and functionality.
Hi, just to answer on the use case:
- Most requests
- Some requests

Hence the two LLM endpoints. That multi-bot feature would be ideal.
@esethna I'm bumping this issue from April. Any chance of reconsidering? We would currently very much like to be able to toggle between 4o and Qwen 2.5 32B on-prem.
Thanks for the feedback. cc// @BillAnderson304 on the above. I don't believe we have any plans to remove the Enterprise requirement for Copilot multi-LLM setups at this time. However, one thing we've heard in the past is that people fork the repo and compile two separate plugins with different bot names and configurations.
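For anyone trying the fork-and-rebuild workaround, the key step is giving the second build a distinct manifest so both plugins can be installed side by side. A minimal sketch, assuming the standard Mattermost `plugin.json` manifest fields (`id`, `name`); the helper function and the example id/name values are purely illustrative, not part of the actual repo:

```python
import json

def rename_plugin_manifest(manifest: dict, new_id: str, new_name: str) -> dict:
    """Return a copy of a plugin.json manifest with a new id and display name,
    so a second build can be installed alongside the original plugin."""
    patched = dict(manifest)
    patched["id"] = new_id       # must be unique per installed plugin
    patched["name"] = new_name   # shown as the plugin/bot name in the UI
    return patched

# Example (illustrative values): patch the manifest of a second checkout
# before building it, e.g. one copy pointed at 4o and one at Qwen on-prem.
original = {"id": "example-ai-plugin", "name": "Copilot", "version": "1.0.0"}
second = rename_plugin_manifest(original, "example-ai-plugin-qwen", "Copilot (Qwen)")
print(json.dumps(second, indent=2))
```

After patching, each checkout would be built and uploaded as its own plugin, each configured with a different LLM endpoint.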
Description
Can the requirement for an Enterprise licence to use multiple LLM endpoints be reconsidered?
If so, allowing 2 to 3 endpoints on the community version, and requiring the Enterprise version beyond that, would be a nice gesture.