Fix & Feat Chats Summarizing #3433

Open · wants to merge 1 commit into base: main
18 changes: 13 additions & 5 deletions app/store/chat.ts
@@ -80,9 +80,10 @@ function createEmptySession(): ChatSession {
   };
 }
 
-function getSummarizeModel(currentModel: string) {
-  // if it is using gpt-* models, force to use 3.5 to summarize
-  return currentModel.startsWith("gpt") ? SUMMARIZE_MODEL : currentModel;
+// fix known issue where summarize is not using the current model selected
+function getSummarizeModel(currentModel: string, modelConfig: ModelConfig) {
+  // should be depends of user selected
+  return currentModel ? modelConfig.model : currentModel;
Contributor commented on lines +84 to +86:
Refactor suggestion for clarity in the getSummarizeModel function.

The logic would be clearer with an explicit null/undefined check rather than a falsy check. A falsy check also treats 0, "", and false as missing; while those are unlikely to be valid model names, they could still lead to unexpected behavior.

-  return currentModel ? modelConfig.model : currentModel;
+  return currentModel != null ? currentModel : modelConfig.model;
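To illustrate the distinction the reviewer is pointing at, here is a minimal standalone sketch (the helper names are hypothetical, not part of the PR). A ternary on a truthy test and the nullish coalescing operator `??` disagree exactly on the falsy-but-not-nullish values such as the empty string:

```typescript
// Falsy check: "", 0, false, and NaN all fall through to the fallback.
function pickModelFalsy(current: string | null, fallback: string): string {
  return current ? current : fallback;
}

// Nullish check: only null and undefined fall through to the fallback.
function pickModelNullish(current: string | null, fallback: string): string {
  return current ?? fallback;
}

// The two checks diverge on an empty string:
// pickModelFalsy("", "gpt-3.5-turbo")   -> "gpt-3.5-turbo"
// pickModelNullish("", "gpt-3.5-turbo") -> ""
```

Whether an empty model name should fall back to the configured model is exactly the behavioral question the suggested change settles.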

Suggested change:

 function getSummarizeModel(currentModel: string, modelConfig: ModelConfig) {
   // should be depends of user selected
-  return currentModel ? modelConfig.model : currentModel;
+  return currentModel != null ? currentModel : modelConfig.model;

}

function countMessages(msgs: ChatMessage[]) {
@@ -490,10 +491,14 @@ export const useChatStore = createPersistStore(
         content: Locale.Store.Prompt.Topic,
       }),
     );
+    // this summarizing method should be depends of user selected
+    const sessionModelConfig = this.currentSession().mask.modelConfig;
+    const topicModel = getSummarizeModel(session.mask.modelConfig.model, sessionModelConfig);
+
     api.llm.chat({
       messages: topicMessages,
       config: {
-        model: getSummarizeModel(session.mask.modelConfig.model),
+        model: topicModel,
       },
       onFinish(message) {
         get().updateCurrentSession(
@@ -539,6 +544,9 @@ export const useChatStore = createPersistStore(
       historyMsgLength > modelConfig.compressMessageLengthThreshold &&
       modelConfig.sendMemory
     ) {
+      // this summarizing method should be depends of user selected
+      const sessionModelConfig = this.currentSession().mask.modelConfig;
+      const summarizeModel = getSummarizeModel(session.mask.modelConfig.model, sessionModelConfig);
       api.llm.chat({
         messages: toBeSummarizedMsgs.concat(
           createMessage({
@@ -550,7 +558,7 @@
         config: {
           ...modelConfig,
           stream: true,
-          model: getSummarizeModel(session.mask.modelConfig.model),
+          model: summarizeModel,
         },
         onUpdate(message) {
           session.memoryPrompt = message;
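Taken together, the patched helper routes summarization to the user-selected model whenever a model name is set, instead of forcing gpt-3.5 for gpt-* models as before. A standalone sketch of the patched behavior, with ModelConfig reduced to the single field used here (a simplification, not the project's full type):

```typescript
interface ModelConfig {
  model: string;
}

// Simplified re-statement of the patched helper for illustration:
// any truthy current model name defers to the session's configured model,
// and a falsy one (e.g. "") is returned unchanged.
function getSummarizeModel(currentModel: string, modelConfig: ModelConfig): string {
  return currentModel ? modelConfig.model : currentModel;
}

const config: ModelConfig = { model: "gpt-4" };
// With a model selected, summarization uses the session's configured model.
console.log(getSummarizeModel("gpt-4", config)); // "gpt-4"
```

Note that because both call sites in this PR pass the same session's mask.modelConfig as both arguments' source, the helper effectively returns the session model whenever one is set, which is the fix's stated intent.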