Allow using Azure OpenAI #431
Conversation
@herbutant because you opened #325: could you test the PR for me? I don't have Azure OpenAI access. Thanks!
@UmerHA, thanks for this. I made one small change locally to also pull the api_version from ENV, because this can also vary (e.g. in my case it is "2023-03-15-preview"). With that it runs as expected: if os.getenv("AZURE_OPENAI_VERSION"): openai.api_version = os.getenv("AZURE_OPENAI_VERSION") Should an amendment to the README.md with Azure-specific instructions also be part of the PR?
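A minimal sketch of how that fallback could look; the default version string is just the one mentioned in the comment above, not something the PR pins:

```python
import os

import openai

# Assumption: fall back to the API version mentioned above when the
# AZURE_OPENAI_VERSION environment variable is not set.
openai.api_version = os.getenv("AZURE_OPENAI_VERSION", "2023-03-15-preview")
```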
@herbutant agree - have adjusted the readme & made the change.
openai.api_version = api_version
openai.api_type = "azure"
openai.api_base = os.getenv("AZURE_OPENAI_ENDPOINT")
openai.api_key = os.getenv("AZURE_OPENAI_KEY")
Would be great to create a separate function handling this to keep it tidy.
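A minimal sketch of such a helper, assuming the function name `configure_azure_openai`; the name is illustrative and not part of the PR, while the environment-variable names follow this diff:

```python
import os

import openai


def configure_azure_openai(api_version: str) -> None:
    # Hypothetical helper: centralizes the Azure-specific openai settings
    # so the call sites stay tidy.
    openai.api_type = "azure"
    openai.api_version = api_version
    openai.api_base = os.getenv("AZURE_OPENAI_ENDPOINT")
    openai.api_key = os.getenv("AZURE_OPENAI_KEY")
```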
    response = openai.ChatCompletion.create(model=self.model, **shared_params)
else:
    # use Azure OpenAI
    response = openai.ChatCompletion.create(engine=self.model, **shared_params)
I realize that if we want to support this, we should in fact have a separate object: the LLM model.
I'd be happy to merge if you find a good way to "abstract away" the LLM object!
I think we need to create a more general abstraction for LLMs if we want to support Azure, so the code doesn't get messy! Let me know if you have thoughts.
@AntonOsika Agree, I think we have 2 options:
1. Use LangChain's LLM abstraction.
2. Build our own abstraction in this repo.
I would vote for using LangChain, as LLM integration is not this project's purpose and LangChain already does that well. I'd be happy to implement the wrapper, as I'm an LC contributor as well. :)

If you think option 2 is better, I propose an abstraction along these lines: a class AI whose subclasses (OpenAI, AzureOpenAI) implement a common set of methods, as sketched below. This would work for all kinds of models (chat/non-chat, streaming/non-streaming). Let me know!
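The concrete method names of the proposal did not survive in this thread, so the following is only an illustrative sketch assuming a single `chat` method; it is not the interface actually proposed in the comment:

```python
from abc import ABC, abstractmethod

import openai


class AI(ABC):
    """Hypothetical base class; the real proposed interface is not shown in this thread."""

    def __init__(self, model: str, **kwargs):
        self.model = model
        self.kwargs = kwargs

    @abstractmethod
    def chat(self, messages: list[dict]) -> str:
        """Send a list of chat messages and return the assistant's reply."""


class OpenAI(AI):
    def chat(self, messages: list[dict]) -> str:
        response = openai.ChatCompletion.create(
            model=self.model, messages=messages, **self.kwargs
        )
        return response["choices"][0]["message"]["content"]


class AzureOpenAI(AI):
    def chat(self, messages: list[dict]) -> str:
        # Azure OpenAI addresses the deployment via `engine` instead of `model`
        response = openai.ChatCompletion.create(
            engine=self.model, messages=messages, **self.kwargs
        )
        return response["choices"][0]["message"]["content"]
```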
Hey, appreciate the thinking on this! Based on discussions in other PRs it looks like we will use LangChain abstractions. Because of that, and because this PR now has conflicts, I'm closing it for now; feel free to reopen.
Allows using Azure OpenAI instead of OpenAI (solves #325)
To be consistent with Azure OpenAI terminology, the model is called an engine instead. Requires setting AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_KEY as environment variables.
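For illustration only, a quick way a user could verify the required configuration is present before running; the variable names come from this PR, the check itself is not part of it:

```python
import os

# Assumption: both variables must be non-empty for the Azure path to work.
required = ["AZURE_OPENAI_ENDPOINT", "AZURE_OPENAI_KEY"]
missing = [name for name in required if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing Azure OpenAI environment variables: {', '.join(missing)}")
```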