[ai] Global on/off switch for LLM usage #44

Open
2 tasks
mmacy opened this issue Jan 28, 2024 · 0 comments
Labels
enhancement New feature or request

Comments

@mmacy
Collaborator

mmacy commented Jan 28, 2024

  • Simplified enable/disable option for LLM usage

    You should be able to flip a single setting like USE_AI without modifying how you call into osrlib. Depending on the flag, the code path would either call the OpenAI API to get strings or use the strings provided by the developer/adventure designer (see the sketch after this list).

  • Document how to enable/disable the LLM switch in both the top-level README and the osrlib README.

    The top-level README currently makes it sound like a simple toggle already exists. While you can enable and disable LLM usage today, it's not clear how, and it's not as clean as described above.
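Something along these lines could work. This is only a rough sketch of the idea: `USE_AI`, `OSRLIB_USE_AI`, and `get_location_description` are placeholder names for illustration, not existing osrlib API.

```python
import os

from openai import OpenAI

# Hypothetical toggle -- could come from an env var, a config file, or an
# argument passed when the game/adventure is constructed.
USE_AI = os.getenv("OSRLIB_USE_AI", "false").lower() in ("1", "true", "yes")

# Only construct the OpenAI client when the switch is on.
_client = OpenAI() if USE_AI else None


def get_location_description(designer_text: str, prompt: str) -> str:
    """Return flavor text for a location.

    If the AI switch is on, ask the OpenAI API to generate the text;
    otherwise fall back to the string supplied by the adventure designer.
    (Function and parameter names are illustrative, not osrlib API.)
    """
    if not USE_AI:
        return designer_text

    response = _client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content or designer_text
```

The point being that callers never branch on the flag themselves; the check lives in one place inside osrlib, and the same calling code works whether the LLM is enabled or not.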

@mmacy mmacy added the enhancement New feature or request label Jan 28, 2024