fix gemini model set up issue (#414)
Commit a8b07f3, 1 parent d1b538a
Showing 3 changed files with 58 additions and 23 deletions.
@@ -1,13 +1,34 @@
 # run agent with gemini-1.5-flash
 run-agent \
-  --llm_name llama3:8b \
-  --llm_backend ollama \
+  --llm_name gemini-1.5-flash \
+  --llm_backend google \
   --agent_name_or_path demo_author/demo_agent \
   --task "Tell me what is core idea of AIOS" \
   --aios_kernel_url http://localhost:8000
+
+# run agent with gpt-4o-mini using openai
+run-agent \
+  --llm_name gpt-4o-mini \
+  --llm_backend openai \
+  --agent_name_or_path demo_author/demo_agent \
+  --task "Tell me what is core idea of AIOS" \
+  --aios_kernel_url http://localhost:8000
+
+# run agent with meta-llama/Meta-Llama-3-8B-Instruct using vllm
+vllm serve meta-llama/Meta-Llama-3-8B-Instruct --dtype auto --port 8001 # start the vllm server
+run-agent \
+  --llm_name meta-llama/Meta-Llama-3-8B-Instruct \
+  --llm_backend vllm \
+  --agent_name_or_path demo_author/demo_agent \
+  --task "Tell me what is core idea of AIOS" \
+  --aios_kernel_url http://localhost:8000
+
+# run agent with llama3:8b using ollama
+ollama pull llama3:8b # pull the model
+ollama serve # start the ollama server
+run-agent \
+  --llm_name llama3:8b \
+  --llm_backend ollama \
+  --agent_name_or_path demo_author/demo_agent \
+  --task "Tell me what is core idea of AIOS" \
+  --aios_kernel_url http://localhost:8000
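The four invocations in this diff differ only in `--llm_name` and `--llm_backend`; every other flag is shared. As a minimal sketch (`run_agent_cmd` is a hypothetical helper for illustration, not part of AIOS), the shared flags can be factored into one shell function. It is written as a dry run that only prints the command it would execute:

```shell
# Sketch: build the run-agent command line for a given model/backend pair.
# "run_agent_cmd" is a hypothetical wrapper; the flags are the ones documented
# in the diff above. The leading "echo" makes this a dry run -- remove it to
# actually invoke run-agent.
run_agent_cmd() {
  model="$1"
  backend="$2"
  echo run-agent \
    --llm_name "$model" \
    --llm_backend "$backend" \
    --agent_name_or_path demo_author/demo_agent \
    --task "Tell me what is core idea of AIOS" \
    --aios_kernel_url http://localhost:8000
}

# Print the command for two of the backends covered by this commit.
run_agent_cmd gemini-1.5-flash google
run_agent_cmd gpt-4o-mini openai
```

Remember that the vllm and ollama backends additionally need their servers started first (`vllm serve ...` / `ollama serve`), as shown in the diff.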