
Bedrock integration not working with Claude 3.5 Sonnet. Errors with "claude-3-5-sonnet-20240620" is not supported on this API. Please use the Messages API instead. #6845

Closed
5 tasks done
stevedaskam opened this issue Sep 18, 2024 · 6 comments
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@stevedaskam

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import { Bedrock } from "@langchain/community/llms/bedrock";
import { BedrockEmbeddings } from "@langchain/aws";
import { Client } from "@opensearch-project/opensearch";
import { AwsSigv4Signer } from "@opensearch-project/opensearch/aws";
import { OpenSearchVectorStore } from "@langchain/community/vectorstores/opensearch";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";
import { defaultProvider } from "@aws-sdk/credential-provider-node";

const region = process.env.AWS_REGION || "us-east-1";
const endpoint = "https://abc123.us-east-1.aoss.amazonaws.com";

const model = new Bedrock({
    model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
    region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
    temperature: 0,
    maxRetries: 2,
});

const embeddings = new BedrockEmbeddings({
    maxRetries: 5,
    region: region,
    model: "amazon.titan-embed-text-v2:0",
});

const client = new Client({
    ...AwsSigv4Signer({
        region: "us-east-1",
        service: "aoss",
        getCredentials: () => {
            const credentialsProvider = defaultProvider();
            return credentialsProvider();
        },
    }),
    node: endpoint
});

const vectorStore = new OpenSearchVectorStore(embeddings, {
    // OpenSearch Serverless configuration
    client: client,
    indexName: "my-index",
    service: 'aoss'
});

const QUESTION_GEN_TEMPLATE = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
    Chat History:
    {chat_history}
    Follow Up Input: {input}
    Standalone question:`;
const contextualizeQPrompt = ChatPromptTemplate.fromTemplate(QUESTION_GEN_TEMPLATE);

const historyAwareRetriever = await createHistoryAwareRetriever({
    llm: model,
    retriever: vectorStore.asRetriever(),
    rephrasePrompt: contextualizeQPrompt,
});

const chat_history = [
    "\n\nHuman: What is Amazon S3? \n\nAssistant:",
];
const result = await historyAwareRetriever.invoke({"input": "\n\nHuman: Tell me more \n\nAssistant:", "chat_history": chat_history });

Error Message and Stack Trace (if applicable)

{ "errorType": "Error", "errorMessage": "Error 400: \"claude-3-5-sonnet-20240620\" is not supported on this API. Please use the Messages API instead.", "trace": [ "Error: Error 400: \"claude-3-5-sonnet-20240620\" is not supported on this API. Please use the Messages API instead.", " at Bedrock._call (file:///opt/nodejs/node_modules/@langchain/community/dist/llms/bedrock/web.js:181:19)", " at process.processTicksAndRejections (node:internal/process/task_queues:95:5)", " at async Promise.all (index 0)", " at async Bedrock._generate (file:///opt/nodejs/node_modules/@langchain/core/dist/language_models/llms.js:354:29)", " at async Bedrock._generateUncached (file:///opt/nodejs/node_modules/@langchain/core/dist/language_models/llms.js:169:26)", " at async Bedrock.invoke (file:///opt/nodejs/node_modules/@langchain/core/dist/language_models/llms.js:32:24)", " at async RunnableSequence.invoke (file:///opt/nodejs/node_modules/@langchain/core/dist/runnables/base.js:1144:33)", " at async RunnableBranch._invoke (file:///opt/nodejs/node_modules/@langchain/core/dist/runnables/branch.js:129:22)", " at async RunnableBranch._callWithConfig (file:///opt/nodejs/node_modules/@langchain/core/dist/runnables/base.js:217:22)", " at async Runtime.handler (file:///var/task/index.mjs:149:18)" ] }

Description

I'm trying to use Bedrock with Claude 3.5 via createHistoryAwareRetriever and am getting the above error. I see that this issue was fixed in the Python version of LangChain, but I didn't find any mention of it in this repo.

Related issue:
langchain-ai/langchain#18513

System Info

Running this on AWS Lambda, Node.js 20.x

Lambda layer has these dependencies:
"dependencies": {
"@aws-crypto/sha256-js": "^5.2.0",
"@aws-sdk/client-dynamodb": "^3.651.1",
"@aws-sdk/client-opensearch": "^3.637.0",
"@aws-sdk/credential-provider-node": "^3.637.0",
"@aws-sdk/node-http-handler": "^3.374.0",
"@aws-sdk/protocol-http": "^3.374.0",
"@aws-sdk/signature-v4": "^3.374.0",
"@langchain/aws": "^0.0.10",
"@langchain/community": "^0.2.31",
"@opensearch-project/opensearch": "^3.0.0-beta.5",
"aws-sdk": "^2.1686.0",
"langchain": "^0.2.17"
}


dosubot bot commented Sep 18, 2024

Hey @stevedaskam! I'm here to help you out with any bugs, questions, or contributions you might have. Let's work together to solve this issue!

I found a similar issue that might help resolve your problem. The error you're encountering is due to using an incorrect model identifier. The correct model identifier for Claude 3.5 should be "anthropic.claude-3-5-sonnet-20240620-v1:0".

Here's how you can update your code to use the correct model identifier:

import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { BedrockEmbeddings } from "@langchain/aws";
import { Client } from "@opensearch-project/opensearch";
import { AwsSigv4Signer } from "@opensearch-project/opensearch/aws";
import { OpenSearchVectorStore } from "@langchain/community/vectorstores/opensearch";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";
import { defaultProvider } from "@aws-sdk/credential-provider-node";

const region = process.env.AWS_REGION || "us-east-1";
const endpoint = "https://abc123.us-east-1.aoss.amazonaws.com";

const model = new BedrockChat({
    model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
    region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
    temperature: 0,
    maxRetries: 2,
});

const embeddings = new BedrockEmbeddings({
    maxRetries: 5,
    region: region,
    model: "amazon.titan-embed-text-v2:0",
});

const client = new Client({
    ...AwsSigv4Signer({
        region: "us-east-1",
        service: "aoss",
        getCredentials: () => {
            const credentialsProvider = defaultProvider();
            return credentialsProvider();
        },
    }),
    node: endpoint
});

const vectorStore = new OpenSearchVectorStore(embeddings, {
    client: client,
    indexName: "my-index",
    service: 'aoss'
});

const QUESTION_GEN_TEMPLATE = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
    Chat History:
    {chat_history}
    Follow Up Input: {input}
    Standalone question:`;
const contextualizeQPrompt = ChatPromptTemplate.fromTemplate(QUESTION_GEN_TEMPLATE);

const historyAwareRetriever = await createHistoryAwareRetriever({
    llm: model,
    retriever: vectorStore.asRetriever(),
    rephrasePrompt: contextualizeQPrompt,
});

const chat_history = [
    "\n\nHuman: What is Amazon S3? \n\nAssistant:",
];
const result = await historyAwareRetriever.invoke({"input": "\n\nHuman: Tell me more \n\nAssistant:", "chat_history": chat_history });

Ensure that you replace "claude-3-5-sonnet-20240620" with "anthropic.claude-3-5-sonnet-20240620-v1:0" in your code [1][2][3][4].


@dosubot added the auto:bug label on Sep 18, 2024
@stevedaskam
Author

I am already using that model ID, just with Bedrock, not BedrockChat.


dosubot bot commented Sep 18, 2024

To resolve the issue, you need to use the BedrockChat class instead of the Bedrock class for the Anthropic model. The Bedrock class does not support the claude-3-5-sonnet-20240620 model. Here is how you can update your code:

import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { BedrockEmbeddings } from "@langchain/aws";
import { Client } from "@opensearch-project/opensearch";
import { AwsSigv4Signer } from "@opensearch-project/opensearch/aws";
import { OpenSearchVectorStore } from "@langchain/community/vectorstores/opensearch";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";
import { defaultProvider } from "@aws-sdk/credential-provider-node";

const region = process.env.AWS_REGION || "us-east-1";
const endpoint = "https://abc123.us-east-1.aoss.amazonaws.com";

const model = new BedrockChat({
    model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
    region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
    temperature: 0,
    maxRetries: 2,
});

const embeddings = new BedrockEmbeddings({
    maxRetries: 5,
    region: region,
    model: "amazon.titan-embed-text-v2:0",
});

const client = new Client({
    ...AwsSigv4Signer({
        region: "us-east-1",
        service: "aoss",
        getCredentials: () => {
            const credentialsProvider = defaultProvider();
            return credentialsProvider();
        },
    }),
    node: endpoint
});

const vectorStore = new OpenSearchVectorStore(embeddings, {
    client: client,
    indexName: "my-index",
    service: 'aoss'
});

const QUESTION_GEN_TEMPLATE = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
    Chat History:
    {chat_history}
    Follow Up Input: {input}
    Standalone question:`;
const contextualizeQPrompt = ChatPromptTemplate.fromTemplate(QUESTION_GEN_TEMPLATE);

const historyAwareRetriever = await createHistoryAwareRetriever({
    llm: model,
    retriever: vectorStore.asRetriever(),
    rephrasePrompt: contextualizeQPrompt,
});

const chat_history = [
    "\n\nHuman: What is Amazon S3? \n\nAssistant:",
];
const result = await historyAwareRetriever.invoke({"input": "\n\nHuman: Tell me more \n\nAssistant:", "chat_history": chat_history });

This should resolve the "Error 400: 'claude-3-5-sonnet-20240620' is not supported on this API" issue by using the correct BedrockChat class for the Anthropic model [1][2].


@stevedaskam
Author

Will this work with ChatBedrockConverse?


dosubot bot commented Sep 18, 2024

Yes, this will work with ChatBedrockConverse. Here is an example of how you can update your code to use ChatBedrockConverse:

import { ChatBedrockConverse, BedrockEmbeddings } from "@langchain/aws";
import { Client } from "@opensearch-project/opensearch";
import { AwsSigv4Signer } from "@opensearch-project/opensearch/aws";
import { OpenSearchVectorStore } from "@langchain/community/vectorstores/opensearch";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";
import { defaultProvider } from "@aws-sdk/credential-provider-node";

const region = process.env.AWS_REGION || "us-east-1";
const endpoint = "https://abc123.us-east-1.aoss.amazonaws.com";

const model = new ChatBedrockConverse({
    model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
    region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
    temperature: 0,
    maxRetries: 2,
    credentials: {
        accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID!,
        secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY!,
    },
});

const embeddings = new BedrockEmbeddings({
    maxRetries: 5,
    region: region,
    model: "amazon.titan-embed-text-v2:0",
});

const client = new Client({
    ...AwsSigv4Signer({
        region: "us-east-1",
        service: "aoss",
        getCredentials: () => {
            const credentialsProvider = defaultProvider();
            return credentialsProvider();
        },
    }),
    node: endpoint
});

const vectorStore = new OpenSearchVectorStore(embeddings, {
    client: client,
    indexName: "my-index",
    service: 'aoss'
});

const QUESTION_GEN_TEMPLATE = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
    Chat History:
    {chat_history}
    Follow Up Input: {input}
    Standalone question:`;
const contextualizeQPrompt = ChatPromptTemplate.fromTemplate(QUESTION_GEN_TEMPLATE);

const historyAwareRetriever = await createHistoryAwareRetriever({
    llm: model,
    retriever: vectorStore.asRetriever(),
    rephrasePrompt: contextualizeQPrompt,
});

const chat_history = [
    "\n\nHuman: What is Amazon S3? \n\nAssistant:",
];
const result = await historyAwareRetriever.invoke({"input": "\n\nHuman: Tell me more \n\nAssistant:", "chat_history": chat_history });

This should resolve the issue by using the ChatBedrockConverse class, which supports the claude-3-5-sonnet-20240620 model [1][2].


@bracesproul
Member

Hey @stevedaskam, the issue appears to be that you're importing from /llms/ instead of /chat_models/. These are two different sets of integrations, with Claude 3.5 Sonnet falling under the chat models. In most cases you'll want to use a chat model; you'll rarely need a text-completion LLM.
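
For later readers, a minimal sketch of that fix, reusing the model ID and options from earlier in the thread (only the import path and class change; the retriever, prompt, and vector store setup can stay as-is):

// The text-completion (LLM) integration rejects Claude 3-family models:
// import { Bedrock } from "@langchain/community/llms/bedrock";

// The chat model integration supports them:
import { BedrockChat } from "@langchain/community/chat_models/bedrock";

const model = new BedrockChat({
    model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
    region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
    temperature: 0,
    maxRetries: 2,
});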
