o1-mini models broken in 0.8.63 #3407
Comments
I am seeing this as well for o1-preview.
Possibly related to #3388; this one uses OpenAI directly, vs. Azure OpenAI.
Still broken in 0.8.65.
Hi folks, thanks for the notice here. Tracking updates to this issue, and other related issues with Azure after the 0.8.63 release, in this thread: #3477
@Patrick-Erichsen I think this is a little bit different: we are using the OpenAI provider to call a proxied API with a standard OpenAI-compatible API, while those issues all use the Azure provider.
Ah, yes, re-reading your original issue description, that is clearer. I'm going to keep this lumped in with the other issues just because the o1 updates were made around the same time, and to make sure we circle back. But thanks for the clarification 👍
Thank you @sestinj
Before submitting your bug report
Relevant environment info
Description
I upgraded to v0.8.63 this morning and my o1-mini completions stopped returning any output. It was working fine until I restarted the extension to install the update.
I receive no error message, see nothing in the developer tools console, and continue.log is empty. This model is an Azure OpenAI model proxied internally (to conform to the standard OpenAI model schema). Via the proxy, we can see that
stream: true
always seems to be sent, regardless of the completionOptions in config.json. Looking at the release commits, I'm guessing this is either part of the config JSON -> YAML conversion or part of the o1 "fixes".
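For context, a minimal config.json sketch of the kind of setup described above, assuming Continue's JSON config schema; the title and apiBase are placeholders for the internal proxy, and the intent is to disable streaming for the o1 model via completionOptions:

```json
{
  "models": [
    {
      "title": "o1-mini (proxied)",
      "provider": "openai",
      "model": "o1-mini",
      "apiBase": "https://proxy.example.com/v1",
      "completionOptions": {
        "stream": false
      }
    }
  ]
}
```

The symptom reported here is that even with a configuration like this, the proxy observes stream: true on the outgoing request.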
To reproduce
Log output