
Bedrock integration with Claude3 fails with ValidationException "claude-3-sonnet-20240229" is not supported on this API. Please use the Messages API instead. #18513

Closed
Barneyjm opened this issue Mar 4, 2024 · 8 comments · Fixed by #18630
Labels
🔌: anthropic Primarily related to Anthropic integrations 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature Ɑ: models Related to LLMs or chat model modules

Comments

@Barneyjm
Contributor

Barneyjm commented Mar 4, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import boto3
from langchain_community.llms import Bedrock

bedrock = boto3.client('bedrock-runtime', 'us-east-1')

MODEL_KWARGS = {
    "anthropic.claude-3-sonnet-20240229-v1:0": {
        "temperature": 0,
        "top_k": 250,
        "top_p": 1,
        "max_tokens_to_sample": 2**10,
    }
}

model_id = 'anthropic.claude-3-sonnet-20240229-v1:0'
llm = Bedrock(client=bedrock, model_id=model_id, model_kwargs=MODEL_KWARGS[model_id])
llm('tell me a joke')

Error Message and Stack Trace (if applicable)

.venv/lib/python3.11/site-packages/langchain_core/_api/deprecation.py:117: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.7 and will be removed in 0.2.0. Use invoke instead.
  warn_deprecated(
Traceback (most recent call last):
  File ".venv/lib/python3.11/site-packages/langchain_community/llms/bedrock.py", line 444, in _prepare_input_and_invoke
    response = self.client.invoke_model(**request_options)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".venv/lib/python3.11/site-packages/botocore/client.py", line 553, in _api_call
    return self._make_api_call(operation_name, kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".venv/lib/python3.11/site-packages/botocore/client.py", line 1009, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.errorfactory.ValidationException: An error occurred (ValidationException) when calling the InvokeModel operation: "claude-3-sonnet-20240229" is not supported on this API. Please use the Messages API instead.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File ".venv/lib/python3.11/site-packages/langchain_core/_api/deprecation.py", line 145, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File ".venv/lib/python3.11/site-packages/langchain_core/language_models/llms.py", line 991, in __call__
    self.generate(
  File ".venv/lib/python3.11/site-packages/langchain_core/language_models/llms.py", line 741, in generate
    output = self._generate_helper(
             ^^^^^^^^^^^^^^^^^^^^^^
  File ".venv/lib/python3.11/site-packages/langchain_core/language_models/llms.py", line 605, in _generate_helper
    raise e
  File ".venv/lib/python3.11/site-packages/langchain_core/language_models/llms.py", line 592, in _generate_helper
    self._generate(
  File ".venv/lib/python3.11/site-packages/langchain_core/language_models/llms.py", line 1177, in _generate
    self._call(prompt, stop=stop, run_manager=run_manager, **kwargs)
  File ".venv/lib/python3.11/site-packages/langchain_community/llms/bedrock.py", line 718, in _call
    return self._prepare_input_and_invoke(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".venv/lib/python3.11/site-packages/langchain_community/llms/bedrock.py", line 451, in _prepare_input_and_invoke
    raise ValueError(f"Error raised by bedrock service: {e}")
ValueError: Error raised by bedrock service: An error occurred (ValidationException) when calling the InvokeModel operation: "claude-3-sonnet-20240229" is not supported on this API. Please use the Messages API instead.

Description

Obviously Claude 3 is brand new, but initial testing with the existing integration suggests a change in how these models need to be invoked.

I'd expect these new models to work with existing LangChain capabilities as drop-in improvements.

System Info

System Information

OS: Linux
OS Version: #50~20.04.1-Ubuntu SMP Wed Sep 6 17:29:11 UTC 2023
Python Version: 3.11.4 (main, Aug 9 2023, 21:54:01) [GCC 10.2.1 20210110]

Package Information

langchain_core: 0.1.28
langchain: 0.1.10
langchain_community: 0.0.25
langsmith: 0.1.14
langchain_text_splitters: 0.0.1
langchainhub: 0.1.14

Packages not installed (Not Necessarily a Problem)

The following packages were not found:

langgraph
langserve

@dosubot dosubot bot added Ɑ: models Related to LLMs or chat model modules 🔌: anthropic Primarily related to Anthropic integrations 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature labels Mar 4, 2024
@efunneko

efunneko commented Mar 4, 2024

Not too surprisingly, this same issue occurs going through the BedrockChat interface.
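For reference, here's a minimal sketch of the same failure through BedrockChat; the model ID, region, and kwargs are taken from the repro above, and at the time of this report it raises the same ValidationException.

import boto3
from langchain_community.chat_models import BedrockChat
from langchain_core.messages import HumanMessage

# Same client and model ID as the Bedrock LLM repro above
client = boto3.client('bedrock-runtime', 'us-east-1')

chat = BedrockChat(
    model_id='anthropic.claude-3-sonnet-20240229-v1:0',
    client=client,
    model_kwargs={'temperature': 0},
)

# Raises the same ValidationException at the time of this report
chat.invoke([HumanMessage(content='tell me a joke')])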

@miroslavtushev

I've opened the same issue with what I believe to be the cause: #18514

@Barneyjm
Contributor Author

Barneyjm commented Mar 4, 2024

Considering the new capabilities of Claude 3 and its input types (images, etc.), this likely requires a rework, or perhaps separate "text" and "image" handlers.

@Barneyjm
Contributor Author

Barneyjm commented Mar 4, 2024

This might be an issue with Bedrock itself: the API endpoint expects a prompt key in the body, but the Anthropic Messages API expects content.

When you pass content you get an "extraneous key" error, but if you format the prompt to the Messages spec, you get "prompt must start with Human:" errors.

langchain_community/llms/bedrock.py:415
...
input_body = LLMInputOutputAdapter.prepare_input(provider, prompt, params)
input_body['content'] = input_body['prompt']

With the content key:

ValueError: Error raised by bedrock service: An error occurred (ValidationException) when calling the InvokeModel operation: Malformed input request: #: extraneous key [content] is not permitted, please reformat your input and try again.

With the prompt formatted to the Messages API spec:

ValueError: Error raised by bedrock service: An error occurred (ValidationException) when calling the InvokeModel operation: prompt must start with "\n\nHuman:" turn after an optional system prompt

@Barneyjm
Contributor Author

Barneyjm commented Mar 5, 2024

Talked with AWS support about the new format of Claude requests. Here's the input body that works with boto3==1.34.54:

{
  "max_tokens": 1024, 
  "system": "Today is January 1, 2024. Only respond in Haiku", 
  "messages": [{"role": "user", "content": "Hello, Claude"}], 
  "anthropic_version": "bedrock-2023-05-31"
}
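For anyone who needs a direct workaround in the meantime, here's a minimal boto3 sketch that sends that body to InvokeModel; the model ID and region come from the repro above, and the response parsing assumes the Claude 3 Messages response shape.

import json
import boto3

client = boto3.client('bedrock-runtime', 'us-east-1')

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 1024,
    "system": "Today is January 1, 2024. Only respond in Haiku",
    "messages": [{"role": "user", "content": "Hello, Claude"}],
}

# Send the Messages-style payload directly, bypassing the LangChain adapter
response = client.invoke_model(
    modelId='anthropic.claude-3-sonnet-20240229-v1:0',
    body=json.dumps(body),
    contentType='application/json',
    accept='application/json',
)

# Claude 3 returns a list of content blocks; print the text of the first one
result = json.loads(response['body'].read())
print(result['content'][0]['text'])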

@jobcase-plebedev

It would be very helpful if Claude 3 worked with RetrievalQA. I can use the same code with Claude 2.1 or OpenAI as long as I instantiate the LLM appropriately (i.e. Bedrock(model_id="anthropic.claude-v2:1"...) or ChatOpenAI), but I can't use the same approach with Claude 3.
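For context, here's a rough sketch of the Claude 2.1 setup that works today, with a tiny FAISS index standing in for the real retriever; the embedding model ID and the indexed text are placeholders, not from this issue.

from langchain.chains import RetrievalQA
from langchain_community.embeddings import BedrockEmbeddings
from langchain_community.llms import Bedrock
from langchain_community.vectorstores import FAISS

# Placeholder index so the example is self-contained
vectorstore = FAISS.from_texts(
    ["Claude models are served through Amazon Bedrock."],
    BedrockEmbeddings(model_id="amazon.titan-embed-text-v1"),
)

# Works with Claude 2.1 today...
llm = Bedrock(model_id="anthropic.claude-v2:1")
# ...but swapping in Claude 3 hits the ValidationException above:
# llm = Bedrock(model_id="anthropic.claude-3-sonnet-20240229-v1:0")

qa = RetrievalQA.from_chain_type(llm=llm, retriever=vectorstore.as_retriever())
print(qa.invoke({"query": "How are Claude models served?"}))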

@Barneyjm
Contributor Author

Barneyjm commented Mar 5, 2024

Starting to take a swing at it here: #18548

@Mantej-Singh

RetrieveAndGenerate is also not supported with anthropic.claude-3-sonnet-20240229-v1:0.

Error:
ValidationException: An error occurred (ValidationException) when calling the RetrieveAndGenerate operation: The model arn provided is not supported. Please check your configuration and retry the request.
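For reference, a hedged sketch of the kind of call that produces this error; the knowledge base ID is a placeholder, and the model ARN uses the Claude 3 Sonnet ID from this thread.

import boto3

agent_client = boto3.client('bedrock-agent-runtime', 'us-east-1')

# 'MYKBID12345' is a placeholder knowledge base ID, not from this issue
response = agent_client.retrieve_and_generate(
    input={"text": "tell me a joke"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "MYKBID12345",
            "modelArn": (
                "arn:aws:bedrock:us-east-1::foundation-model/"
                "anthropic.claude-3-sonnet-20240229-v1:0"
            ),
        },
    },
)
# Fails with the ValidationException above at the time of this report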

efriis added a commit that referenced this issue Mar 6, 2024
Fixes #18513.

## Description
This PR attempts to fix support for Anthropic Claude v3 models in the BedrockChat LLM. The changes update the payload to use the `messages` format instead of the formatted text prompt for all models; the `messages` API is backwards compatible with all Anthropic models, so this should not break the experience for any model.


## Notes
The PR in its current form does not support the v3 models for the non-chat Bedrock LLM. This means that, with these changes, users won't be able to use the v3 models with the Bedrock LLM. I can open a separate PR to tackle that use case; the intent here was to get this out quickly so users can start using and testing the chat LLM. The Bedrock LLM classes have also grown complex, with a lot of conditions to support various providers and models, and are ripe for a refactor to make future changes more palatable. That refactor is likely to take longer and requires more thorough testing from the community. Credit to PRs [18579](#18579) and [18548](#18548) for some of the code here.

---------

Co-authored-by: Erick Friis <[email protected]>
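For illustration only, here's a rough sketch of the kind of prompt-to-messages conversion the PR describes; it is a simplified stand-in under assumed defaults, not the code that was merged.

def prompt_to_messages_body(prompt: str, max_tokens: int = 1024) -> dict:
    # Strip the legacy "\n\nHuman: ... \n\nAssistant:" markers and wrap the
    # remaining text as a single user turn in a Messages API payload.
    text = (
        prompt.replace("\n\nHuman:", "")
        .replace("\n\nAssistant:", "")
        .strip()
    )
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": text}],
    }

print(prompt_to_messages_body("\n\nHuman: tell me a joke\n\nAssistant:"))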