
[Bug] Error when upgrading lobechat from v1.32.0 to v1.32.1 using model o1 preview #5018

Open
abix5 opened this issue Dec 14, 2024 · 4 comments
Labels
🐛 Bug Something isn't working | 缺陷 Model Provider 模型服务商

Comments


abix5 commented Dec 14, 2024

📦 Environment

Docker

📌 Version

v1.32.1

💻 Operating System

macOS

🌐 Browser

Chrome

🐛 Bug Description

After upgrading LobeChat from version v1.32.0 to v1.32.1, I encountered the following error when using the o1-preview model:

{
  "error": {
    "code": 13,
    "details": [
      {
        "@type": "type.googleapis.com/openaiproxy.v1.ErrorResponse",
        "error": {
          "code": "",
          "type": "custom",
          "param": "",
          "message": "openai.(*LLMProvider).ChatCompletionStream.func1.1: try error with instance oai:1lWvbo - this model has beta-limitations, streaming not supported",
          "critical_data_details": [],
          "limit_reached_details": null
        }
      },
      {
        "@type": "type.googleapis.com/openaiproxy.v1.StatusCode",
        "value": 500
      }
    ],
    "message": "openaiproxy.(*ServiceImplementation).processChatCompletionStreamRequest: can't do chat completion stream - proxy.(*UseCase).processChatCompletionStreamResponse: error stream completion - openai.(*LLMProvider).ChatCompletionStream.func1: can't create - code: <nil>, message: openai.(*LLMProvider).ChatCompletionStream.func1.1: try error with instance oai:1lWvbo - this model has beta-limitations, streaming not supported, param: <nil>, type: custom"
  },
  "endpoint": "https://*****/public/v1/",
  "provider": "openai"
}

It appears that the error indicates streaming is not supported for this model due to beta limitations. This issue did not occur in version v1.32.0.
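For context, one client-side workaround (outside LobeChat itself) is to simply not request streaming for the beta-limited models. A minimal sketch in Python, assuming `o1-preview` and `o1-mini` are the model names that reject streamed requests during the beta period (the helper names here are hypothetical, not LobeChat internals):

```python
# Hypothetical helper: choose the `stream` flag based on the model name.
# Assumption: during the beta period, o1-preview and o1-mini rejected
# streaming requests, while the full o1 model accepted them.
NON_STREAMING_MODELS = {"o1-preview", "o1-mini"}

def supports_streaming(model: str) -> bool:
    """Return True if the model is expected to accept stream=true."""
    return model not in NON_STREAMING_MODELS

def build_chat_request(model: str, messages: list) -> dict:
    """Build a chat-completion payload, disabling streaming when unsupported."""
    return {
        "model": model,
        "messages": messages,
        "stream": supports_streaming(model),
    }
```

With this, a request for `o1-preview` is sent with `"stream": false`, avoiding the proxy error above, while `o1` requests still stream.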

Steps to Reproduce:

1. Upgrade LobeChat from version v1.32.0 to v1.32.1.
2. Use the o1-preview model for a chat completion.
3. Observe the error message.

📷 Recurrence Steps

No response

🚦 Expected Behavior

No response

📝 Additional Information

No response

@abix5 abix5 added the 🐛 Bug Something isn't working | 缺陷 label Dec 14, 2024
@github-project-automation github-project-automation bot moved this to Roadmap - Chat 1.x in Lobe Chat Routine Dec 14, 2024
@lobehubbot
Member

👀 @abix5

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.

@abix5 abix5 changed the title [Bug] Error when upgrading lobechat from v1.32.0 to v1.32.1 using model o1 [Bug] Error when upgrading lobechat from v1.32.0 to v1.32.1 using model o1 preview Dec 14, 2024
@dosubot dosubot bot added the Model Provider 模型服务商 label Dec 14, 2024

dosubot bot commented Dec 14, 2024

Hey @abix5! I'm here to help you with any bugs, questions, or contributions you have regarding lobechat. Let's tackle this issue together!

The error you're encountering after upgrading to version v1.32.1 is related to the model o1's beta limitations, specifically regarding streaming support. In version v1.32.1, support for "o1 models using streaming" was added, which might have introduced changes affecting how streaming is handled for this model [1].

The error message indicates that streaming is not supported for the o1 model due to its beta limitations. This suggests that while the framework now supports streaming for o1 models, there might be specific conditions or configurations under which streaming is not yet fully operational.

To address this issue, you might want to check if there are any configuration settings or flags that control model streaming capabilities in Lobe Chat, although specific documentation on this wasn't found in the repository. Additionally, reviewing any recent changes in the codebase related to streaming support for o1 models could provide further insights into resolving this error.


@arvinxx
Contributor

arvinxx commented Dec 14, 2024

OpenAI's o1 supports streaming, so you may need to use the official OpenAI API.

@abix5
Author

abix5 commented Dec 14, 2024

I do use the official OpenAI API. Yes, o1 supports streaming, but at the moment not all corporate accounts have been granted o1 access. Many still don't have it and are forced to use o1-preview, where streaming support is unavailable.
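Until every account has o1 access, a robust client could attempt a streamed request and fall back to a non-streamed one when the provider rejects it. A minimal sketch, assuming the error text matches the "streaming not supported" message from the report above (`call_api` is a placeholder for the real chat-completion call, not an actual LobeChat or OpenAI SDK function):

```python
# Hypothetical fallback: try a streamed request first, and retry without
# streaming if the provider rejects it (as o1-preview does while in beta).
def chat_completion_with_fallback(call_api, model: str, messages: list):
    """Call `call_api(model=..., messages=..., stream=...)`, retrying
    with stream=False if a streaming-not-supported error is raised."""
    try:
        return call_api(model=model, messages=messages, stream=True)
    except RuntimeError as err:
        if "streaming not supported" in str(err):
            return call_api(model=model, messages=messages, stream=False)
        raise  # unrelated errors propagate unchanged
```

This keeps streaming for models that support it while transparently degrading for beta-limited ones.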
