Could the documentation be updated to clearly outline the steps for setting up local LLMs?
I spent around two hours trying to figure out why there was no "Large Language Model" tab in the Skyrim mod section for Mantella. Additionally, the web configuration docs don't mention that you have to manually enter the service URL when Mantella starts up in order to connect an OpenAI-compatible backend such as Ollama. It also turned out I had to add `--extensions openai` to `CMD_FLAGS.txt`.
This information seems to be missing from the documentation, and more clarity here would save users significant time and effort!
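For anyone hitting the same wall, here is what eventually worked for me. The `CMD_FLAGS.txt` step applies when the backend is text-generation-webui (that's my setup; if you run Ollama directly you can skip it):

```
# CMD_FLAGS.txt (in the text-generation-webui install folder)
--extensions openai
```

Once the backend is running, Mantella prompts for the service URL at startup, and you paste in the base URL of the OpenAI-compatible endpoint. Below is a minimal sketch for checking that the endpoint is actually live before launching Mantella. The two URLs are the commonly documented defaults and are assumptions on my part; adjust them to your own setup:

```python
# Sanity check that an OpenAI-compatible endpoint is reachable before
# pointing Mantella at it. Both URLs are assumed defaults -- adjust as needed.
import requests

candidates = [
    "http://127.0.0.1:5000/v1",   # text-generation-webui openai extension (assumed default port)
    "http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint (assumed default port)
]

for base_url in candidates:
    try:
        # /v1/models is part of the OpenAI API surface, so compatible backends expose it
        resp = requests.get(f"{base_url}/models", timeout=5)
        print(f"{base_url}: HTTP {resp.status_code}")
    except requests.ConnectionError:
        print(f"{base_url}: not reachable")
```

Whichever URL responds is the one to enter in Mantella's startup configuration.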