[Feature Request] Support InternLM Deploy #168
Dear LLamaSharp developer,

Greetings! I am vansinhu, a community developer and volunteer at InternLM. InternLM is a large language model similar to llama2, and we look forward to InternLM being supported in LLamaSharp. If there are any challenges or inquiries regarding support for InternLM, please feel free to join our Discord discussion at https://discord.gg/gF9ezcmtM3.

Best regards,
vansinhu
Is InternLM supported by llama.cpp? If it is, then we probably already support it!
martindevans added the Upstream (Tracking an issue in llama.cpp) and enhancement (New feature or request) labels on Nov 8, 2023.
ggerganov/llama.cpp#5184 has been merged (4 days ago), so #479 should include InternLM support for LLamaSharp.
0.10.0 has just been released, which should include InternLM support at long last!
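For anyone finding this issue later: since support comes through llama.cpp, an InternLM model first has to be converted to GGUF with llama.cpp's conversion tooling (that is what the upstream PR enables); after that it should load through the ordinary LLamaSharp API with nothing InternLM-specific on the C# side. A minimal sketch, assuming LLamaSharp 0.10.0 and a locally converted GGUF file (the model path and file name below are placeholders, not part of this issue):

```csharp
using System;
using LLama;
using LLama.Common;

// Placeholder path to a GGUF-converted InternLM model; adjust to your own file.
string modelPath = @"./models/internlm-chat-7b.Q4_K_M.gguf";

var parameters = new ModelParams(modelPath)
{
    ContextSize = 2048,   // prompt + generation window
    GpuLayerCount = 0     // set > 0 to offload layers when using a GPU backend
};

// Load the weights once, then create a context and an executor for inference.
using var model = LLamaWeights.LoadFromFile(parameters);
using var context = model.CreateContext(parameters);
var executor = new InteractiveExecutor(context);

var inferenceParams = new InferenceParams { MaxTokens = 128 };

// Note: chat-tuned InternLM variants expect their own prompt template;
// a raw prompt like this is only for a quick smoke test.
await foreach (var token in executor.InferAsync(
    "Introduce InternLM in one sentence.", inferenceParams))
{
    Console.Write(token);
}
```

This mirrors the usual LLamaSharp loading pattern rather than anything model-specific; if the GGUF loads in llama.cpp's own tools, it should behave the same way here.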