🦙 docs: Update Ollama + LiteLLM Instructions (#2302)
* Update litellm.md

* set OPENAI_API_KEY of litellm service (needs to be set if ollama's openai api compatibility is used)
mariusgau authored Apr 4, 2024
1 parent 94950b6 commit 09cd1a7
Showing 2 changed files with 6 additions and 5 deletions.
1 change: 1 addition & 0 deletions — docker-compose.override.yml.example

```diff
@@ -122,6 +122,7 @@ version: '3.4'
 #      - ./litellm/litellm-config.yaml:/app/config.yaml
 #    command: [ "--config", "/app/config.yaml", "--port", "8000", "--num_workers", "8" ]
 #    environment:
+#      OPENAI_API_KEY: none ## needs to be set if ollama's openai api compatibility is used
 #      REDIS_HOST: redis
 #      REDIS_PORT: 6379
 #      REDIS_PASSWORD: RedisChangeMe
```
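With the override applied and the service uncommented, the `litellm` service environment would look roughly like the sketch below. This is an illustration, not part of the commit: the `none` placeholder value comes from the diff, and works because Ollama's OpenAI-compatible endpoint does not validate the key, while LiteLLM's `openai/*` provider still requires one to be set.

```yaml
# Sketch of the litellm service environment after this change.
# OPENAI_API_KEY can be any non-empty placeholder; Ollama ignores its value.
environment:
  OPENAI_API_KEY: none        # required when routing via ollama's OpenAI-compatible /v1 endpoint
  REDIS_HOST: redis
  REDIS_PORT: 6379
  REDIS_PASSWORD: RedisChangeMe
```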
10 changes: 5 additions & 5 deletions — docs/install/configuration/litellm.md

```diff
@@ -48,13 +48,13 @@ model_list:
       rpm: 1440
   - model_name: mixtral
     litellm_params:
-      model: ollama/mixtral:8x7b-instruct-v0.1-q5_K_M
-      api_base: http://ollama:11434
+      model: openai/mixtral:8x7b-instruct-v0.1-q5_K_M # use openai/* for ollama's openai api compatibility
+      api_base: http://ollama:11434/v1
       stream: True
   - model_name: mistral
     litellm_params:
-      model: ollama/mistral
-      api_base: http://ollama:11434
+      model: openai/mistral # use openai/* for ollama's openai api compatibility
+      api_base: http://ollama:11434/v1
       stream: True
 litellm_settings:
   success_callback: ["langfuse"]
@@ -95,4 +95,4 @@ Key components and features include:
 - **Deployment and Performance**: Information on deploying LiteLLM Proxy and its performance metrics.
 - **Proxy CLI Arguments**: A wide range of command-line arguments for customization.
 
-Overall, LiteLLM Server offers a comprehensive suite of tools for managing, deploying, and interacting with a variety of LLMs, making it a versatile choice for large-scale AI applications.
\ No newline at end of file
+Overall, LiteLLM Server offers a comprehensive suite of tools for managing, deploying, and interacting with a variety of LLMs, making it a versatile choice for large-scale AI applications.
```
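Once the proxy is running with this configuration, clients address it with an ordinary OpenAI-style chat-completions payload, using a `model` value that matches one of the configured `model_name` entries. A minimal sketch of building such a payload (the `chat_request` helper is hypothetical, introduced only for illustration; the model name `mixtral` comes from the config above):

```python
import json

def chat_request(model: str, prompt: str, stream: bool = True) -> dict:
    """Build the JSON body an OpenAI-compatible client would POST to the
    LiteLLM proxy's /v1/chat/completions endpoint (hypothetical helper)."""
    return {
        "model": model,  # must match a model_name from the LiteLLM config
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }

body = chat_request("mixtral", "Say hello.")
print(json.dumps(body, indent=2))
```

The proxy maps the `model` field to the matching `litellm_params`, so the client never needs to know whether the backend is Ollama's native API or its OpenAI-compatible `/v1` endpoint.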
