chore(weave): load model default properly #3096
Conversation
Preview this PR with FeatureBee: https://beta.wandb.ai/?betaVersion=b483c14f1538019bdf14dbaa88cee8aed681d17b
I think the smart default might be a bit too magic. Deferring to the team
@@ -121,3 +121,68 @@ export const LLM_MAX_TOKENS = {
};

export type LLMMaxTokensKey = keyof typeof LLM_MAX_TOKENS;

export const LLM_MAX_TOKENS_KEYS: LLMMaxTokensKey[] = Object.keys(
Suggested change:
- export const LLM_MAX_TOKENS_KEYS: LLMMaxTokensKey[] = Object.keys(
+ export const LLM_MAX_TOKENS_KEYS = Object.keys(
) as LLMMaxTokensKey[];

// Helper function to calculate string similarity using Levenshtein distance
const getLevenshteinDistance = (str1: string, str2: string): number => {
I think we are using js-levenshtein in the app, can you use that?
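For reference, a minimal sketch of how the helper could delegate to the js-levenshtein package instead of a hand-rolled implementation; the wrapper name mirrors the helper in the diff, but the import style and example values are assumptions, not taken from this PR:

```ts
// Sketch: delegating to the js-levenshtein package.
// The package exports a single function that returns the edit distance.
import levenshtein from 'js-levenshtein';

// Hypothetical wrapper that keeps existing call sites unchanged.
const getLevenshteinDistance = (str1: string, str2: string): number =>
  levenshtein(str1, str2);

// Example: appending '-mini' to a model name yields a distance of 5.
// getLevenshteinDistance('gpt-4o', 'gpt-4o-mini') === 5
```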
  LLM_MAX_TOKENS_KEYS
);
toast(
  `We currently don't support ${inputs.model}, in the playground. We will default to ${closestModel}`
Do we always want to default to the most similar model? Can it ever be a typo? I would expect this case to happen mostly with custom models, which users might not want auto-selected.
@gtarpenning down to remove the smart default and just load the model in if it exists, and if not, go with the default model
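A minimal sketch of that simpler behavior, assuming the `LLMMaxTokensKey` type and `LLM_MAX_TOKENS_KEYS` list from the diff above; the `resolveModel` name and `DEFAULT_MODEL` constant are hypothetical:

```ts
// Hypothetical default; the app's real default model lives elsewhere.
const DEFAULT_MODEL = 'gpt-4o-mini' as LLMMaxTokensKey;

// Load the model from the trace if we support it, otherwise fall back
// to the default model without any fuzzy matching.
const resolveModel = (model: string): LLMMaxTokensKey =>
  LLM_MAX_TOKENS_KEYS.includes(model as LLMMaxTokensKey)
    ? (model as LLMMaxTokensKey)
    : DEFAULT_MODEL;
```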
Description
Before, the model was not loaded into the settings when opening a trace in the playground.
Adds a smart default if the model is not one of the models we support.
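For context, a rough sketch of what that smart default could look like, pieced together from the diff fragments above (`getLevenshteinDistance`, `LLM_MAX_TOKENS_KEYS`, and the toast message); the `pickClosestModel` function and its exact assembly are assumptions, and the implementation in the PR may differ:

```ts
// Sketch of the smart default: if the trace's model is not a supported key,
// pick the supported model with the smallest Levenshtein distance and warn.
const pickClosestModel = (model: string): LLMMaxTokensKey => {
  if (LLM_MAX_TOKENS_KEYS.includes(model as LLMMaxTokensKey)) {
    return model as LLMMaxTokensKey;
  }
  let closestModel = LLM_MAX_TOKENS_KEYS[0];
  let bestDistance = Infinity;
  for (const key of LLM_MAX_TOKENS_KEYS) {
    const distance = getLevenshteinDistance(model, key);
    if (distance < bestDistance) {
      bestDistance = distance;
      closestModel = key;
    }
  }
  toast(
    `We currently don't support ${model}, in the playground. We will default to ${closestModel}`
  );
  return closestModel;
};
```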