
[Question]: Redis Chat Store with ChatSummaryMemoryBuffer guide #13308

Closed
1 task done
risos8200 opened this issue May 7, 2024 · 13 comments · Fixed by #13325
Labels
question Further information is requested

Comments

@risos8200

Question Validation

  • I have searched both the documentation and discord for an answer.

Question

Is the redis chat store still supported?

I am trying to have session-managed memory history. Another question recommended using a chat store with keys, so I wanted to use RedisChatStore for this. Does anyone have a guide or test examples for using RedisChatStore with the new ChatSummaryMemoryBuffer?

risos8200 added the question label on May 7, 2024

dosubot bot commented May 7, 2024

Yes, the RedisChatStore is indeed supported and can be effectively used with the ChatSummaryMemoryBuffer for managing session-based memory history. Here's how you can integrate them:

  1. Initialize RedisChatStore with your Redis server details:

    from llama_index.storage.chat_store.redis import RedisChatStore
    redis_chat_store = RedisChatStore(redis_url="redis://localhost:6379", ttl=300)
  2. Set up the summarizer LLM (e.g., using OpenAI's GPT model) and a tokenizer function:

    from llama_index.llms.openai import OpenAI as OpenAiLlm
    import tiktoken
    model = "gpt-4-0125-preview"  # Example model
    summarizer_llm = OpenAiLlm(model_name=model, max_tokens=256)
    tokenizer_fn = tiktoken.encoding_for_model(model).encode
  3. Initialize ChatSummaryMemoryBuffer with the RedisChatStore and other parameters:

    from llama_index.core.memory.chat_summary_memory_buffer import ChatSummaryMemoryBuffer
    chat_summary_memory_buffer = ChatSummaryMemoryBuffer.from_defaults(
        chat_store=redis_chat_store,
        chat_store_key="user1_session_key",
        summarizer_llm=summarizer_llm,
        token_limit_full_text=256,
        tokenizer_fn=tokenizer_fn,
    )
  4. Manage your chat history by adding new messages and retrieving the chat history as needed:

    from llama_index.core.llms import ChatMessage, MessageRole
    new_message = ChatMessage(role=MessageRole.USER, content="Hello, how can I help you?")
    chat_summary_memory_buffer.put(new_message)
    chat_history = chat_summary_memory_buffer.get()

This setup allows you to manage long chat histories efficiently by summarizing older messages and keeping the most relevant information within a specified token limit. For detailed examples and tests, including handling token limits and summarizing long messages, refer to test_chat_summary_memory_buffer.py in the LlamaIndex repository.
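
Pulling the steps together, here is one consolidated sketch. Hedged: the parameter names summarizer_llm and token_limit_full_text above may not match the released signature; the working example later in this thread uses token_limit, and recent releases pass the summarizer as llm.

    # Consolidated sketch of steps 1-4; llm= and token_limit= are assumptions
    # based on the working example later in this thread, not on the bot's
    # summarizer_llm / token_limit_full_text names.
    import tiktoken
    from llama_index.llms.openai import OpenAI
    from llama_index.storage.chat_store.redis import RedisChatStore
    from llama_index.core.memory import ChatSummaryMemoryBuffer
    from llama_index.core.llms import ChatMessage, MessageRole

    model = "gpt-4-0125-preview"
    chat_store = RedisChatStore(redis_url="redis://localhost:6379", ttl=300)

    memory = ChatSummaryMemoryBuffer.from_defaults(
        llm=OpenAI(model=model, max_tokens=256),  # summarizer LLM (kwarg name assumed)
        token_limit=256,
        tokenizer_fn=tiktoken.encoding_for_model(model).encode,
        chat_store=chat_store,
        chat_store_key="user1_session_key",
    )

    memory.put(ChatMessage(role=MessageRole.USER, content="Hello!"))
    print(memory.get())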


@logan-markewich
Collaborator

It's definitely still supported 😅 At first glance the bot response looks right. I can check further in a bit.

@risos8200
Author

Thanks a lot, @logan-markewich , please let me know!

And just to confirm: can I pass chat_summary_memory_buffer to agent.chat directly for dynamic updates of the user conversation, instead of calling chat_summary_memory_buffer.put manually?

@risos8200
Author

risos8200 commented May 7, 2024

    pydantic.v1.error_wrappers.ValidationError: 1 validation error for ChatSummaryMemoryBuffer
    chat_store
      Can't instantiate abstract class BaseChatStore with abstract methods add_message, delete_last_message, delete_message, delete_messages, get_keys, get_messages, set_messages (type=type_error)

Getting this error when building with the Redis client:

    import os
    import redis
    from llama_index.core.agent import ReActAgent
    from llama_index.core.memory import ChatSummaryMemoryBuffer
    from llama_index.storage.chat_store.redis import RedisChatStore

    redis_client = redis.Redis(
        host="myurl",
        port=myport,
        username="default",
        password=os.getenv("mypassword"),
    )
    chat_store = RedisChatStore(redis_client=redis_client)

    memory = ChatSummaryMemoryBuffer.from_defaults(
        chat_store=chat_store,
        chat_store_key=user_key,
        token_limit=512,
    )

    agent = ReActAgent.from_tools(my_tool_list, llm=llm, memory=memory, verbose=True)

@logan-markewich
Collaborator

This worked for me:

    from llama_index.core.memory import ChatSummaryMemoryBuffer
    from llama_index.storage.chat_store.redis import RedisChatStore
    from llama_index.core import Document, VectorStoreIndex

    index = VectorStoreIndex.from_documents([Document.example()])

    chat_store = RedisChatStore(redis_url="redis://localhost:6379")
    memory = ChatSummaryMemoryBuffer.from_defaults(
        token_limit=5000,
        chat_store=chat_store,
        chat_store_key="user123",
    )

    chat_engine = index.as_chat_engine(
        chat_mode="condense_plus_context",
        memory=memory,
    )

    response = chat_engine.chat("hello!")
    print(memory.get_all())

    # Re-create the store and buffer from scratch to verify the history
    # actually persisted in Redis.
    chat_store = RedisChatStore(redis_url="redis://localhost:6379")
    memory = ChatSummaryMemoryBuffer.from_defaults(
        token_limit=5000,
        chat_store=chat_store,
        chat_store_key="user123",
    )
    print(memory.get_all())

Although I noticed a bug in from_defaults for the summary buffer where it always erases the previous chat history. Will fix that
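
Until that fix lands, a possible stopgap (a sketch only: the chat_history kwarg is an assumption about the released signature, and get_messages is one of the BaseChatStore methods listed in the error above) is to read the existing history out of Redis before constructing the buffer:

    # Sketch of a workaround for the history-erasing from_defaults bug:
    # pre-load the stored history so it is not wiped on construction.
    # chat_history= is an assumed kwarg; verify against your installed version.
    existing_history = chat_store.get_messages("user123")
    memory = ChatSummaryMemoryBuffer.from_defaults(
        chat_history=existing_history,
        token_limit=5000,
        chat_store=chat_store,
        chat_store_key="user123",
    )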

@risos8200
Author

I am getting ModuleNotFoundError: No module named 'llama_index.storage'. Am I missing something here?

@risos8200
Author

Never mind, found it in #12364
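
For anyone else hitting this: the Redis chat store ships as its own integration package, so it has to be installed separately:

    pip install llama-index-storage-chat-store-redis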

@risos8200
Author

@logan-markewich it seems like the Redis data gets pushed when you call the chat_engine:

    chat_engine = index.as_chat_engine(chat_mode="condense_plus_context", memory=memory)

How do we do that when we are using agent.chat? I tried a couple of times and even added this chat_mode to agent.chat, but Redis doesn't seem to get updated.

@logan-markewich
Collaborator

@risos8200 that's not when it gets pushed? It gets pushed on every .chat() call?

@logan-markewich
Collaborator

It seems to work fine for me, other than that bug I mentioned above

@risos8200
Author

Based on my analysis of the code, there is self._memory.put(ChatMessage(content=message, role=MessageRole.USER)) in the chat engine, but no such call in AgentRunner. Do you think it's a bug, or am I missing something?

@logan-markewich
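
One quick way to see where the write happens is to call the agent and then read the session key straight back from the store (a sketch reusing the agent and chat_store names from the snippet above; get_messages is one of the BaseChatStore methods listed in the earlier error):

    # Sketch: check whether agent.chat persisted anything under the session key.
    response = agent.chat("hello!")
    print(chat_store.get_messages(user_key))  # an empty list means nothing was pushed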

@samayra2029

@risos8200 that's not when it gets pushed? It gets pushed on every .chat() call?

@risos8200 are you able to get messages showing up on the Redis side? I am not seeing updates on the Redis side either. I am using the stream_chat call.

@usmanmukhtarml

    Expected type 'BaseChatStore | None', got 'RedisChatStore' instead

Getting the above-mentioned error on the following lines:

    memory = ChatMemoryBuffer.from_defaults(
        token_limit=5000,
        chat_store=redis_chat_store,
        chat_store_key=user_uuid,
    )
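
That reads like a static type-checker (IDE) warning rather than a runtime error, since RedisChatStore subclasses BaseChatStore. A sketch of one way to appease the checker, assuming the BaseChatStore import path below is current for your version:

    from typing import cast
    from llama_index.core.storage.chat_store import BaseChatStore

    # The cast is a no-op at runtime; it only tells the static checker that
    # redis_chat_store satisfies the declared BaseChatStore | None type.
    memory = ChatMemoryBuffer.from_defaults(
        token_limit=5000,
        chat_store=cast(BaseChatStore, redis_chat_store),
        chat_store_key=user_uuid,
    )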
