
Fix empty response in the conversation #967

Merged
merged 1 commit into arc53:main on May 28, 2024

Conversation

starkgate
Contributor

  • What kind of change does this PR introduce? (Bug fix, feature, docs update, ...)

When asking a question, the LLM's response streams in as expected, then disappears from the conversation once streaming finishes. It reappears if the page is refreshed.

The bug occurs when the token sent by the LLM is an empty string; with llama-cpp, the last token is always an empty string. In conversationSlice.ts#L151 the frontend then treats the response as empty, takes the else branch of the conditional, and overwrites the conversation with an empty response.

  • Why was this change needed? (You can also link to an open issue here)

Fixes #944

  • Other information:
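The branching described above can be sketched as follows. This is an illustrative reconstruction, not the actual DocsGPT code: the names `ConversationState` and `appendToken` are hypothetical, and the real logic lives in a Redux reducer in conversationSlice.ts.

```typescript
// Hypothetical sketch of the token-handling bug and fix.
// Names are illustrative; the real code is a Redux reducer
// in frontend/src/conversation/conversationSlice.ts.

interface ConversationState {
  response: string; // accumulated streamed answer
}

// Before the fix: an empty final token falls into the else
// branch and replaces the accumulated response with "".
function appendTokenBuggy(
  state: ConversationState,
  token: string,
): ConversationState {
  if (token) {
    return { response: state.response + token };
  } else {
    return { response: token }; // wipes the streamed answer
  }
}

// After the fix: an empty token (llama-cpp's final token) is
// ignored, so the accumulated response survives end-of-stream.
function appendTokenFixed(
  state: ConversationState,
  token: string,
): ConversationState {
  if (token === '') {
    return state;
  }
  return { response: state.response + token };
}
```

With this guard, the empty terminal token from llama-cpp leaves the state untouched instead of clearing the finished answer.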


vercel bot commented May 28, 2024

@starkgate is attempting to deploy a commit to the Arc53 Team on Vercel.

A member of the Team first needs to authorize it.


vercel bot commented May 28, 2024

The latest updates on your projects.

Name: docs-gpt — Status: ✅ Ready — Updated (UTC): May 28, 2024 9:14am

Contributor

@dartpain dartpain left a comment


Thank you!

@dartpain dartpain merged commit 967b195 into arc53:main May 28, 2024
8 checks passed

Successfully merging this pull request may close these issues.

🐛 Bug Report: Empty response in the conversation after finishing streaming