
chore(weave): load model default properly #3096

Merged 5 commits into master on Nov 27, 2024

Conversation

@jwlee64 (Contributor) commented Nov 26, 2024

Description

Before this change, the model was not loaded into the settings when opening a trace in the playground. This PR loads it properly and adds a smart default when the model is not one of the models we support.

[screenshot]

@jwlee64 requested review from a team as code owners, November 26, 2024 21:56
@circle-job-mirror (bot) commented Nov 26, 2024

@gtarpenning (Member) left a comment

I think the smart default might be a bit too magic. Deferring to the team.

From the diff (hunk `@@ -121,3 +121,68 @@` in the LLM max-tokens config):

```ts
export const LLM_MAX_TOKENS = {
  // ... (existing model entries)
};

export type LLMMaxTokensKey = keyof typeof LLM_MAX_TOKENS;

export const LLM_MAX_TOKENS_KEYS: LLMMaxTokensKey[] = Object.keys(
```
Member review comment on this line, with a suggested change:

```diff
-export const LLM_MAX_TOKENS_KEYS: LLMMaxTokensKey[] = Object.keys(
+export const LLM_MAX_TOKENS_KEYS = Object.keys(
```

```ts
) as LLMMaxTokensKey[];
```
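For context on the suggestion above, here is a minimal sketch (the model entries and token counts are hypothetical stand-ins for the real `LLM_MAX_TOKENS` table): `Object.keys()` is typed as returning `string[]`, so the trailing `as LLMMaxTokensKey[]` assertion is what narrows the result, and the declaration annotation is redundant once the cast is present.

```ts
// Minimal sketch with hypothetical entries; the real LLM_MAX_TOKENS table is
// larger. Object.keys() returns string[], so the `as` assertion does the
// narrowing and the explicit annotation on the constant adds nothing.
const LLM_MAX_TOKENS = {
  'gpt-4o': {max_tokens: 4096},
  'gpt-4o-mini': {max_tokens: 16384},
} as const;

type LLMMaxTokensKey = keyof typeof LLM_MAX_TOKENS;

export const LLM_MAX_TOKENS_KEYS = Object.keys(
  LLM_MAX_TOKENS
) as LLMMaxTokensKey[];
```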

```ts
// Helper function to calculate string similarity using Levenshtein distance
const getLevenshteinDistance = (str1: string, str2: string): number => {
```
Member review comment on this line:

I think we are using js-levenshtein in the app, can you use that?
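If that dependency is indeed available, the hand-rolled helper could likely be dropped in favor of it. A small sketch, assuming `js-levenshtein`'s default export, a function that returns the edit distance between two strings:

```ts
import levenshtein from 'js-levenshtein';

// The package's default export returns the Levenshtein edit distance
// between two strings.
console.log(levenshtein('gpt4o', 'gpt-4o')); // 1, a plausible typo
console.log(levenshtein('my-custom-model', 'gpt-4o')); // much larger, clearly not a typo
```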

From a later hunk, where the fallback is surfaced to the user via a toast:

```ts
  LLM_MAX_TOKENS_KEYS
);
toast(
  `We currently don't support ${inputs.model}, in the playground. We will default to ${closestModel}`
```
Member review comment on this line:

Do we always want to default to the most similar model? Can it ever be a typo? I would expect this case to most likely happen with custom models, which users might not want to have auto-selected.
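One way to make the auto-selection less aggressive, sketched here purely as a hypothetical alternative (the threshold value, the fallback model, and the function name are assumptions, not code from this PR), would be to accept a fuzzy match only when it looks like a typo and otherwise fall back to a fixed default instead of guessing:

```ts
import levenshtein from 'js-levenshtein';

// Hypothetical guard: treat only near-misses (likely typos) as a match;
// for anything farther away, e.g. a custom model name, use a fixed default.
const MAX_TYPO_DISTANCE = 2; // assumed threshold, not from the PR
const FALLBACK_MODEL = 'gpt-4o-mini'; // assumed default, not from the PR

const resolveFuzzyModel = (
  requested: string,
  supported: readonly string[]
): string => {
  const closest = supported.reduce((best, key) =>
    levenshtein(requested, key) < levenshtein(requested, best) ? key : best
  );
  return levenshtein(requested, closest) <= MAX_TYPO_DISTANCE
    ? closest
    : FALLBACK_MODEL;
};
```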

@jwlee64 (Contributor, Author) commented Nov 27, 2024

@gtarpenning down to remove the smart default and just load the model in if it exists; if not, go with the default model.
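A rough sketch of that simpler behavior (the default-model constant and function name are placeholders, not identifiers from this PR): use the trace's model only when it is an exact supported key, otherwise fall back to the playground default.

```ts
// Hypothetical sketch of the agreed approach: no fuzzy matching, just use the
// trace's model when it is one of the supported keys, else the default model.
const DEFAULT_MODEL = 'gpt-4o-mini'; // placeholder default, not from the PR

const loadModelOrDefault = (
  requestedModel: string | undefined,
  supportedModels: readonly string[]
): string =>
  requestedModel && supportedModels.includes(requestedModel)
    ? requestedModel
    : DEFAULT_MODEL;
```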

@jwlee64 requested a review from gtarpenning, November 27, 2024 19:54
@jwlee64 merged commit 4e84d3a into master Nov 27, 2024
120 checks passed
@jwlee64 deleted the josiah/load-model-in branch November 27, 2024 20:38
@github-actions bot locked and limited conversation to collaborators Nov 27, 2024