
Langchain ValidationError: str type expected #19037

Closed
5 tasks done
ntvuongg opened this issue Mar 13, 2024 · 1 comment
Labels
🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature)
Ɑ: models (Related to LLMs or chat model modules)

Comments

@ntvuongg

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

template = """<s>[INST] <<SYS>>
- Bạn là một trợ lí Tiếng Việt nhiệt tình và trung thực.
- Hãy luôn trả lời một cách hữu ích nhất có thể, đồng thời giữ an toàn.
- Câu trả lời của bạn không nên chứa bất kỳ nội dung gây hại, phân biệt chủng tộc, phân biệt giới tính, độc hại, nguy hiểm hoặc bất hợp pháp nào.
- Hãy đảm bảo rằng các câu trả lời của bạn không có thiên kiến xã hội và mang tính tích cực.
- Nếu một câu hỏi không có ý nghĩa hoặc không hợp lý về mặt thông tin, hãy giải thích tại sao thay vì trả lời một điều gì đó không chính xác.
- Nếu bạn không biết câu trả lời cho một câu hỏi, hãy trẳ lời là bạn không biết và vui lòng không chia sẻ thông tin sai lệch.
- Hãy trả lời một cách ngắn gọn, súc tích và chỉ trả lời chi tiết nếu được yêu cầu.
<</SYS>>

{question} [/INST]"""

@tool
def time(text: str) -> str:
    """Returns today's date; use this for any \
    questions related to knowing today's date. \
    The input should always be an empty string, \
    and this function will always return today's \
    date - any date mathematics should occur \
    outside this function."""
    return str(date.today())

prompt = PromptTemplate(template=template, input_variables=["question"])
callback_manager = CallbackManager([StreamingStdOutCallbackHandler()])
llm = ChatOllama(model="vistral-7b-q8", temperature=0.0, callback_manager=callback_manager)
llm_chain = LLMChain(prompt=prompt, llm=llm, output_parser=StrOutputParser())
tools = load_tools(["ddg-search", "wikipedia"], llm=llm_chain)
agent = initialize_agent(
    tools + [time],
    llm_chain,
    agent=AgentType.CHAT_ZERO_SHOT_REACT_DESCRIPTION,
    handle_parsing_errors=True,
    verbose=True)

agent("Hôm nay ngày mấy?")  # "What is today's date?"

Error Message and Stack Trace (if applicable)

File ~/miniconda3/envs/stt/lib/python3.9/site-packages/langchain_core/_api/deprecation.py:145, in deprecated..deprecate..warning_emitting_wrapper(*args, **kwargs)
143 warned = True
144 emit_warning()
--> 145 return wrapped(*args, **kwargs)

File ~/miniconda3/envs/stt/lib/python3.9/site-packages/langchain/chains/base.py:378, in Chain.call(self, inputs, return_only_outputs, callbacks, tags, metadata, run_name, include_run_info)
346 """Execute the chain.
347
348 Args:
(...)
369 Chain.output_keys.
370 """
371 config = {
372 "callbacks": callbacks,
373 "tags": tags,
374 "metadata": metadata,
375 "run_name": run_name,
376 }
--> 378 return self.invoke(
379 inputs,
...
343 object_setattr(pydantic_self, 'dict', values)

ValidationError: 1 validation error for Generation
text
str type expected (type=type_error.str)

Description

I ran into this problem while using LangChain with an LLM to get output from the agent above.
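
For context, the exception itself is raised by the Generation model in langchain_core, whose text field must be a plain string. The following is a minimal sketch, not part of the original report, that only assumes langchain_core's public Generation class and reproduces the same validation error:

from langchain_core.outputs import Generation

# Generation.text is declared as str, so any non-string value fails pydantic validation.
Generation(text="hello")            # OK
Generation(text={"text": "hello"})  # ValidationError: str type expected (type=type_error.str)

This fits the suggestion below that initialize_agent should be given the bare chat model rather than an LLMChain: the chain's dict output appears to end up in Generation(text=...), where plain generation text is expected.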

System Info

System Information

OS: Linux
OS Version: #50-Ubuntu SMP PREEMPT_DYNAMIC Mon Jul 10 18:24:29 UTC 2023
Python Version: 3.9.18 (main, Sep 11 2023, 13:41:44)
[GCC 11.2.0]

Package Information

langchain_core: 0.1.31
langchain: 0.1.12
langchain_community: 0.0.28
langsmith: 0.1.23
langchain_experimental: 0.0.54
langchain_openai: 0.0.8
langchain_text_splitters: 0.0.1
langchainhub: 0.1.15

Packages not installed (Not Necessarily a Problem)

The following packages were not found:

langgraph
langserve

dosubot added the labels Ɑ: models (Related to LLMs or chat model modules) and 🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature) on Mar 13, 2024
@keenborder786 (Contributor)

Hey, initialize_agent does not expect an LLMChain; it expects a BaseLanguageModel, which in your case is ChatOllama. You therefore need to change your code as follows:

prompt = PromptTemplate(template=template, input_variables=["question"])
callback_manager = CallbackManager([StreamingStdOutCallbackHandler()])
llm = ChatOllama(model="vistral-7b-q8", temperature=0.0, callback_manager=callback_manager)
agent = initialize_agent(  # this builds the chain internally
    tools + [time],
    llm,  # pass the chat model itself, not an LLMChain
    agent=AgentType.CHAT_ZERO_SHOT_REACT_DESCRIPTION,
    handle_parsing_errors=True,
    verbose=True,
    agent_kwargs={"prefix": template},  # pass the custom template as the agent prefix
)

agent("Hôm nay ngày mấy?")  # "What is today's date?"

dosubot added the stale label (Issue has not had recent activity or appears to be solved) on Jun 13, 2024
dosubot closed this as not planned (Won't fix, can't repro, duplicate, stale) on Jun 20, 2024
dosubot removed the stale label on Jun 20, 2024