
fix: use type key rather than response for llmMesage #578

Merged 1 commit into main on Mar 6, 2025

Conversation

@codeincontext (Collaborator) commented Mar 6, 2025

In a local branch I accidentally disabled structured outputs and the chat stopped working. I found that the LLM was returning a message in this format: `"response":"llmMessage",...`, when it should be `"type":"llmMessage",...`.

This is because the example in the prompt differs from the structure required by the structured output schema.

Description

  • Change the response example in the prompt to use the "type" key for the part type
  • Unescape the quotes in the example, since the other examples are not escaped
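To illustrate the mismatch, here is a minimal sketch (the names `LlmMessagePart` and `isLlmMessagePart` are hypothetical, not the repo's actual schema code): when the prompt example keys the part type under `"response"`, the model copies that shape verbatim, and a validator built against the real schema rejects it.

```typescript
// Hypothetical shape of the message part the structured-output
// schema expects: the discriminator lives under the "type" key.
interface LlmMessagePart {
  type: "llmMessage";
  // ...other fields omitted for brevity
}

// Minimal runtime check against the expected shape.
function isLlmMessagePart(value: unknown): value is LlmMessagePart {
  return (
    typeof value === "object" &&
    value !== null &&
    (value as Record<string, unknown>).type === "llmMessage"
  );
}

// What the LLM emitted after copying the old prompt example:
const wrong: unknown = JSON.parse('{"response":"llmMessage"}');
// What the structured-output schema actually requires:
const right: unknown = JSON.parse('{"type":"llmMessage"}');

console.log(isLlmMessagePart(wrong)); // false
console.log(isLlmMessagePart(right)); // true
```

With structured outputs enabled, the provider enforces the schema regardless of the prompt example, which is why the bug only surfaced once structured outputs were disabled and the model fell back to imitating the prompt.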


vercel bot commented Mar 6, 2025

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| oak-ai-lesson-assistant | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Mar 6, 2025 9:04am |


sonarqubecloud bot commented Mar 6, 2025


github-actions bot commented Mar 6, 2025

Playwright test results

- ✅ 17 passed
- ⏭ 2 skipped

Details

- Report: Open report ↗︎
- Stats: 19 tests across 16 suites
- Duration: 2 minutes, 38 seconds
- Commit: dee66bc

Skipped tests

- No persona › tests/auth.test.ts › authenticate through Clerk UI
- No persona › tests/chat-performance.test.ts › Component renders during lesson chat › There are no unnecessary rerenders across left and right side of chat

@codeincontext codeincontext requested a review from stefl March 6, 2025 15:31
@codeincontext codeincontext marked this pull request as ready for review March 6, 2025 15:31
@codeincontext codeincontext merged commit 1cce025 into main Mar 6, 2025
20 checks passed
@codeincontext codeincontext deleted the fix/llm-message-format branch March 6, 2025 16:34