# Contributing third-party LLMs

This page contains some specific guidelines and examples for contributing integrations with third-party LLM providers.

Make sure you read the general guidelines page first!

## Example PR

We'll be referencing this PR adding Amazon SageMaker endpoints as an example: #1267

## General ideas

The general idea for adding a new third-party LLM is to subclass the `LLM` class and implement the `_call` method. As the name suggests, this method should call the LLM with the given prompt and transform the LLM's response into a generated string output.
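For instance, a minimal subclass might look like the sketch below. This is illustrative only: the import path varies across versions, and `callMyProvider` is a hypothetical stand-in for a real provider SDK.

```typescript
import { LLM } from "langchain/llms/base";

// Hypothetical stand-in for a provider SDK request.
async function callMyProvider(body: { prompt: string }): Promise<{ text: string }> {
  // ... an HTTP request to the provider would go here ...
  return { text: "generated output" };
}

export class MyProviderLLM extends LLM {
  _llmType(): string {
    return "my_provider";
  }

  /** Call the LLM with the given prompt and return the generated text. */
  async _call(prompt: string): Promise<string> {
    const response = await callMyProvider({ prompt });
    return response.text;
  }
}
```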

The example PR for Amazon SageMaker is an interesting case because SageMaker endpoints can host a wide variety of models with non-standard input and output formats. To handle this, the contributor added a simple abstract class that transforms input from LangChain into the format a given model expects and transforms the model's output into a plain string; users implement it for whichever specific model they are hosting.
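As an illustration of that pattern, such an abstraction might look roughly like this (the class name, members, and type parameters here are illustrative rather than copied from the PR):

```typescript
// Illustrative sketch of the content-handler idea: users subclass it for
// the specific model their SageMaker endpoint hosts.
export abstract class BaseContentHandler<InputType, OutputType> {
  /** MIME type of the request body the hosted model expects. */
  abstract contentType: string;

  /** Serialize a LangChain input into the bytes the model expects. */
  abstract transformInput(
    prompt: InputType,
    modelKwargs: Record<string, unknown>
  ): Promise<Uint8Array>;

  /** Parse the raw endpoint response bytes into a plain output value. */
  abstract transformOutput(output: Uint8Array): Promise<OutputType>;
}
```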

Other third-party providers like OpenAI and Anthropic have a defined input and output format; in those cases, the input and output transformations should happen directly within the `_call` method.

## Wrap LLM requests in `this.caller`

The base `LLM` class contains an instance property called `caller` that automatically handles retries, errors, timeouts, and more. You should wrap calls to the LLM provider in `this.caller.call`, as in the sketch below.
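As a sketch, the `_call` method from the hypothetical `MyProviderLLM` above could route its request through `this.caller` like so:

```typescript
async _call(prompt: string): Promise<string> {
  // Route the provider request through the caller so retries, timeouts,
  // and error handling are applied automatically.
  const response = await this.caller.call(() => callMyProvider({ prompt }));
  return response.text;
}
```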