How to run it for OpenAI response streaming #177

Open
dev-PankajK opened this issue Jan 2, 2024 · 2 comments

Comments

@dev-PankajK

No description provided.

@OliverThomas2000

This is how my setup works. I consume a streaming response from a FastAPI backend.

I have this function in my action provider for updating an existing message:

const updateLastMessage = (message) => {
  setState((prev) => ({
    ...prev,
    messages: [...prev.messages.slice(0, -1), { ...prev.messages.at(-1), message }],
  }));
};
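To see what that updater does outside of React, here is a minimal plain-JS simulation: `setState` is replaced by applying the reducer to a local `state` object, and the message shape is an assumption based on what `createChatBotMessage` produces.

```javascript
// Plain-JS simulation of the updater above; the `{ type, message }` shape
// is an assumption, not react-chatbot-kit's exact internal structure.
let state = {
  messages: [
    { type: "user", message: "Hi" },
    { type: "bot", message: "streaming..." }, // the dummy message being updated
  ],
};

const updateLastMessage = (message) => {
  // Immutable update: keep every message but the last, then replace the
  // last one's `message` field while preserving its other properties.
  state = {
    ...state,
    messages: [...state.messages.slice(0, -1), { ...state.messages.at(-1), message }],
  };
};

updateLastMessage("Hello");
console.log(state.messages.at(-1).message); // prints "Hello"
```

Because each call replaces the last message rather than appending, the chat history stays at the same length while the bot's reply grows with each streamed chunk.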

I then use this inside the action:

    // `reader` comes from the streaming response body, e.g. response.body.getReader()
    let done, value;
    let messageBuffer = "";
    const decoder = new TextDecoder("utf-8");
    addMessageToState(createChatBotMessage("streaming...")); // You need a dummy message to update
    while (!done) {
      ({ done, value } = await reader.read());
      // { stream: true } keeps multi-byte characters intact across chunk boundaries
      messageBuffer += decoder.decode(value, { stream: true });
      updateLastMessage(messageBuffer);
    }
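The loop above can be exercised end to end without a network call. In this sketch a `ReadableStream` of three chunks stands in for the real fetch response body, and the `onChunk` callback stands in for `updateLastMessage`; both names are assumptions for illustration.

```javascript
// Stand-in for the streaming response body you would normally get from
// fetch(...).then((r) => r.body); the chunk contents are hypothetical.
const encoder = new TextEncoder();

function makeStream(parts) {
  return new ReadableStream({
    start(controller) {
      for (const p of parts) controller.enqueue(encoder.encode(p));
      controller.close();
    },
  });
}

// The same accumulate-and-update loop as in the answer above.
async function readAll(reader, onChunk) {
  const decoder = new TextDecoder("utf-8");
  let messageBuffer = "";
  let done, value;
  while (!done) {
    ({ done, value } = await reader.read());
    messageBuffer += decoder.decode(value, { stream: true });
    onChunk(messageBuffer); // in the chatbot this would be updateLastMessage
  }
  return messageBuffer;
}

const reader = makeStream(["Hel", "lo, ", "world"]).getReader();
readAll(reader, () => {}).then((text) => console.log(text)); // prints "Hello, world"
```

Note that `onChunk` receives the full buffer so far, not just the latest chunk, which is why replacing the last message wholesale works.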

Bear in mind that you can also adjust the delay on createChatBotMessage; its default is 750 ms. If you don't want any loading animation at all, you can set it to a negative value (delay: -750). As far as I know this is safe to do.

@michaelsrichter

@OliverThomas2000 this was very helpful. Thank you!
