
Improve documentation for local LLMs (e.g. ollama) and fix incorrect entries in the documentation #423

Open
chrisoutwright opened this issue Oct 27, 2024 · 0 comments

Comments

@chrisoutwright

Could the documentation be updated to clearly outline the steps for setting up local LLMs?

I spent around two hours trying to figure out why there was no "Large Language Model" tab in the Skyrim MOD section for Mantella. Additionally, the web configuration does not mention that you have to manually enter the URL for the service when Mantella starts up in order to connect an OpenAI-compatible ollama endpoint; it also seems I had to add `--extensions openai` to CMD_FLAGS.txt.

This information seems to be missing (and would be worth noting), and more clarity here would save users significant time and effort!
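For reference, here is a minimal sketch of how the OpenAI-compatible endpoint could be checked before entering its URL in Mantella's web configuration. It assumes ollama is serving its OpenAI-compatible API on the default port (11434); the URL and the model name are placeholders and may differ on your setup:

```python
# Minimal sketch: verify a local OpenAI-compatible server (e.g. ollama) responds
# before pointing Mantella at it. Assumes ollama's default endpoint; adjust as needed.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed default ollama OpenAI-compatible URL
    api_key="none",  # local servers typically ignore the key, but the client requires one
)

response = client.chat.completions.create(
    model="llama3",  # placeholder; use a model you have pulled locally
    messages=[{"role": "user", "content": "Hello from Mantella setup test"}],
)
print(response.choices[0].message.content)
```

If this prints a reply, the same base URL should be the one to enter in Mantella's configuration when it asks for the service URL.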
