Bedrock integration not working with Claude 3.5 Sonnet. Errors with "claude-3-5-sonnet-20240620" is not supported on this API. Please use the Messages API instead. #6845
Hey @stevedaskam! I'm here to help you out with any bugs, questions, or contributions you might have. Let's work together to solve this issue!

I found a similar issue that might help resolve your problem. The error you're encountering is due to using an incorrect model identifier. The correct model identifier for Claude 3.5 is `anthropic.claude-3-5-sonnet-20240620-v1:0`. Here's how you can update your code to use it:

```typescript
import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { BedrockEmbeddings } from "@langchain/aws";
import { Client } from "@opensearch-project/opensearch";
import { AwsSigv4Signer } from "@opensearch-project/opensearch/aws";
import { OpenSearchVectorStore } from "@langchain/community/vectorstores/opensearch";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";
import { defaultProvider } from "@aws-sdk/credential-provider-node";

const region = process.env.AWS_REGION || "us-east-1";
const endpoint = "https://abc123.us-east-1.aoss.amazonaws.com";
const model = new BedrockChat({
model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
temperature: 0,
maxRetries: 2,
});
const embeddings = new BedrockEmbeddings({
maxRetries: 5,
region: region,
model: "amazon.titan-embed-text-v2:0",
});
const client = new Client({
...AwsSigv4Signer({
region: "us-east-1",
service: "aoss",
getCredentials: () => {
const credentialsProvider = defaultProvider();
return credentialsProvider();
},
}),
node: endpoint
});
const vectorStore = new OpenSearchVectorStore(embeddings, {
client: client,
indexName: "my-index",
service: 'aoss'
});
const QUESTION_GEN_TEMPLATE = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
Chat History:
{chat_history}
Follow Up Input: {input}
Standalone question:`;
const contextualizeQPrompt = ChatPromptTemplate.fromTemplate(QUESTION_GEN_TEMPLATE);
const historyAwareRetriever = await createHistoryAwareRetriever({
llm: model,
retriever: vectorStore.asRetriever(),
rephrasePrompt: contextualizeQPrompt,
});
const chat_history = [
"\n\nHuman: What is Amazon S3? \n\nAssistant:",
];
const result = await historyAwareRetriever.invoke({ input: "\n\nHuman: Tell me more \n\nAssistant:", chat_history });
```

Ensure that you replace the bare model name `claude-3-5-sonnet-20240620` with the fully qualified identifier `anthropic.claude-3-5-sonnet-20240620-v1:0`.
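As a quick sanity check (a minimal sketch on my part, assuming AWS credentials are already available to the Lambda role or environment), you can invoke the model directly before wiring it into the retriever chain:

```typescript
import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { HumanMessage } from "@langchain/core/messages";

// Bedrock expects the fully qualified model id, including the "anthropic."
// provider prefix and the "-v1:0" version suffix.
const model = new BedrockChat({
  model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
  region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
});

// A single-turn call; if the model id or API class is wrong, the 400 error
// surfaces here rather than deep inside the retriever chain.
const response = await model.invoke([new HumanMessage("What is Amazon S3?")]);
console.log(response.content);
```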
I am already using that model ID, just with Bedrock, not BedrockChat.
To resolve the issue, you need to use the `BedrockChat` class from `@langchain/community/chat_models/bedrock` instead of the `Bedrock` class: Claude 3 models are only served through the Messages API, which the chat model class uses. Here's the updated code:

```typescript
import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { BedrockEmbeddings } from "@langchain/aws";
import { Client } from "@opensearch-project/opensearch";
import { OpenSearchVectorStore } from "@langchain/community/vectorstores/opensearch";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";
import { AwsSigv4Signer } from "@opensearch-project/opensearch/aws";
import { defaultProvider } from "@aws-sdk/credential-provider-node";
const region = process.env.AWS_REGION || "us-east-1";
const endpoint = "https://abc123.us-east-1.aoss.amazonaws.com";
const model = new BedrockChat({
model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
temperature: 0,
maxRetries: 2,
});
const embeddings = new BedrockEmbeddings({
maxRetries: 5,
region: region,
model: "amazon.titan-embed-text-v2:0",
});
const client = new Client({
...AwsSigv4Signer({
region: "us-east-1",
service: "aoss",
getCredentials: () => {
const credentialsProvider = defaultProvider();
return credentialsProvider();
},
}),
node: endpoint
});
const vectorStore = new OpenSearchVectorStore(embeddings, {
client: client,
indexName: "my-index",
service: 'aoss'
});
const QUESTION_GEN_TEMPLATE = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
Chat History:
{chat_history}
Follow Up Input: {input}
Standalone question:`;
const contextualizeQPrompt = ChatPromptTemplate.fromTemplate(QUESTION_GEN_TEMPLATE);
const historyAwareRetriever = await createHistoryAwareRetriever({
llm: model,
retriever: vectorStore.asRetriever(),
rephrasePrompt: contextualizeQPrompt,
});
const chat_history = [
"\n\nHuman: What is Amazon S3? \n\nAssistant:",
];
const result = await historyAwareRetriever.invoke({ input: "\n\nHuman: Tell me more \n\nAssistant:", chat_history });
```

This should resolve the "Error 400: 'claude-3-5-sonnet-20240620' is not supported on this API" issue by using the correct `BedrockChat` class, which calls the Messages API.
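As a side note (this is my suggestion, not something from the thread): since `BedrockChat` goes through the Messages API, the legacy `\n\nHuman:`/`\n\nAssistant:` markers in the prompt strings are no longer needed. The history can be passed as structured messages via a `MessagesPlaceholder`, for example:

```typescript
import { ChatPromptTemplate, MessagesPlaceholder } from "@langchain/core/prompts";
import { AIMessage, HumanMessage } from "@langchain/core/messages";

// A rephrase prompt that accepts the history as structured messages.
const contextualizeQPrompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    "Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.",
  ],
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
]);

// Structured chat history instead of raw Human:/Assistant: strings.
const chat_history = [
  new HumanMessage("What is Amazon S3?"),
  new AIMessage("Amazon S3 is an object storage service from AWS."),
];

// Rebuild the retriever with this prompt as rephrasePrompt, then:
const result = await historyAwareRetriever.invoke({
  input: "Tell me more",
  chat_history,
});
```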
Will this work with `ChatBedrockConverse`?
Yes, this will work with `ChatBedrockConverse`. Here's the updated code using the `ChatBedrockConverse` class from `@langchain/aws`:

```typescript
import { ChatBedrockConverse } from "@langchain/aws";
import { BedrockEmbeddings } from "@langchain/aws";
import { Client } from "@opensearch-project/opensearch";
import { OpenSearchVectorStore } from "@langchain/community/vectorstores/opensearch";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";
import { AwsSigv4Signer } from "@opensearch-project/opensearch/aws";
import { defaultProvider } from "@aws-sdk/credential-provider-node";
const region = process.env.AWS_REGION || "us-east-1";
const endpoint = "https://abc123.us-east-1.aoss.amazonaws.com";
const model = new ChatBedrockConverse({
model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
temperature: 0,
maxRetries: 2,
credentials: {
accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY!,
},
});
const embeddings = new BedrockEmbeddings({
maxRetries: 5,
region: region,
model: "amazon.titan-embed-text-v2:0",
});
const client = new Client({
...AwsSigv4Signer({
region: "us-east-1",
service: "aoss",
getCredentials: () => {
const credentialsProvider = defaultProvider();
return credentialsProvider();
},
}),
node: endpoint
});
const vectorStore = new OpenSearchVectorStore(embeddings, {
client: client,
indexName: "my-index",
service: 'aoss'
});
const QUESTION_GEN_TEMPLATE = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
Chat History:
{chat_history}
Follow Up Input: {input}
Standalone question:`;
const contextualizeQPrompt = ChatPromptTemplate.fromTemplate(QUESTION_GEN_TEMPLATE);
const historyAwareRetriever = await createHistoryAwareRetriever({
llm: model,
retriever: vectorStore.asRetriever(),
rephrasePrompt: contextualizeQPrompt,
});
const chat_history = [
"\n\nHuman: What is Amazon S3? \n\nAssistant:",
];
const result = await historyAwareRetriever.invoke({ input: "\n\nHuman: Tell me more \n\nAssistant:", chat_history });
```

This should resolve the issue by using the `ChatBedrockConverse` class, which calls Bedrock's Converse API.
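If you want to verify the Converse setup in isolation first (again a minimal sketch, under the same credential assumptions as above):

```typescript
import { ChatBedrockConverse } from "@langchain/aws";

const model = new ChatBedrockConverse({
  model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
  region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
});

// ChatBedrockConverse goes through Bedrock's Converse API, which supports
// chat models like Claude 3.5 directly, so the legacy text-completions
// error raised by the Bedrock class does not apply here.
const response = await model.invoke("What is Amazon S3?");
console.log(response.content);
```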
Hey @stevedaskam, the issue appears to be that you're importing from `@langchain/community/llms/bedrock/web`, which exports the text-completion `Bedrock` class, instead of importing `BedrockChat` from `@langchain/community/chat_models/bedrock/web`. Claude 3 models only support the Messages API, which the chat model class uses.
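Concretely (my reading of the stack trace in the error report below, which points at `@langchain/community/dist/llms/bedrock/web.js`), the fix is a one-line import change:

```typescript
// Fails for Claude 3.x: this entrypoint exports the text-completion Bedrock
// class, which calls the legacy Text Completions API.
// import { Bedrock } from "@langchain/community/llms/bedrock/web";

// Works: the chat-model entrypoint calls the Messages API instead.
import { BedrockChat } from "@langchain/community/chat_models/bedrock/web";

const model = new BedrockChat({
  model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
  region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
});
```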
Error Message and Stack Trace (if applicable)
{ "errorType": "Error", "errorMessage": "Error 400: \"claude-3-5-sonnet-20240620\" is not supported on this API. Please use the Messages API instead.", "trace": [ "Error: Error 400: \"claude-3-5-sonnet-20240620\" is not supported on this API. Please use the Messages API instead.", " at Bedrock._call (file:///opt/nodejs/node_modules/@langchain/community/dist/llms/bedrock/web.js:181:19)", " at process.processTicksAndRejections (node:internal/process/task_queues:95:5)", " at async Promise.all (index 0)", " at async Bedrock._generate (file:///opt/nodejs/node_modules/@langchain/core/dist/language_models/llms.js:354:29)", " at async Bedrock._generateUncached (file:///opt/nodejs/node_modules/@langchain/core/dist/language_models/llms.js:169:26)", " at async Bedrock.invoke (file:///opt/nodejs/node_modules/@langchain/core/dist/language_models/llms.js:32:24)", " at async RunnableSequence.invoke (file:///opt/nodejs/node_modules/@langchain/core/dist/runnables/base.js:1144:33)", " at async RunnableBranch._invoke (file:///opt/nodejs/node_modules/@langchain/core/dist/runnables/branch.js:129:22)", " at async RunnableBranch._callWithConfig (file:///opt/nodejs/node_modules/@langchain/core/dist/runnables/base.js:217:22)", " at async Runtime.handler (file:///var/task/index.mjs:149:18)" ] }
Description
I'm trying to use Bedrock with Claude 3.5 via `createHistoryAwareRetriever` and I'm getting the above error. I see that this issue was fixed in the Python version of LangChain, but I didn't find any mention of it in this repo.
Related issue:
langchain-ai/langchain#18513
System Info
Running this on AWS Lambda, Node.js 20.x.
The Lambda layer has these dependencies:
"dependencies": {
"@aws-crypto/sha256-js": "^5.2.0",
"@aws-sdk/client-dynamodb": "^3.651.1",
"@aws-sdk/client-opensearch": "^3.637.0",
"@aws-sdk/credential-provider-node": "^3.637.0",
"@aws-sdk/node-http-handler": "^3.374.0",
"@aws-sdk/protocol-http": "^3.374.0",
"@aws-sdk/signature-v4": "^3.374.0",
"@langchain/aws": "^0.0.10",
"@langchain/community": "^0.2.31",
"@opensearch-project/opensearch": "^3.0.0-beta.5",
"aws-sdk": "^2.1686.0",
"langchain": "^0.2.17"
}
```