This Python application demonstrates how to use the Ollama library to create a chat interface with additional functionalities like weather retrieval and number comparison.
- Interactive chat with an AI model (llama3.1:8b-instruct-fp16)
- Function calling capabilities
- Weather information retrieval
- Number comparison
- Python 3.x
- ollama library
- requests library
- Ollama model: llama3.1:8b-instruct-fp16
- Clone this repository or download the script.
- Install the required libraries:
pip install ollama requests
- Install Ollama and download the required model:
- Visit the official Ollama models website
- Follow the installation instructions for your operating system
- Once Ollama is installed, open a terminal and run:
ollama pull llama3.1:8b-instruct-fp16
Run the script using Python:
python ollama_chat.py
The application will start an interactive chat session. You can ask questions or use the following functionalities:
- Get current weather for a city
- Compare two numbers
To exit the application, type 'quit'.
Retrieves the current temperature for a specified city using the wttr.in API.
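As a rough sketch of that lookup (the function and helper names here are assumptions, and format=j1 is one of several output formats wttr.in offers), the implementation could look like:

```python
def build_weather_url(city: str) -> str:
    """Build the wttr.in request URL; format=j1 asks for JSON output."""
    return f"https://wttr.in/{city}?format=j1"


def get_current_weather(city: str) -> str:
    """Fetch the current temperature for a city from wttr.in."""
    import requests  # listed in the requirements above

    response = requests.get(build_weather_url(city), timeout=10)
    response.raise_for_status()
    data = response.json()
    # j1 responses carry the temperature under current_condition
    temp_c = data["current_condition"][0]["temp_C"]
    return f"Current temperature in {city}: {temp_c}°C"
```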
Compares two numbers and returns which one is bigger or if they are the same.
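The comparison itself is simple; a minimal version might be:

```python
def compare_numbers(a: float, b: float) -> str:
    """Say which of two numbers is bigger, or whether they are the same."""
    if a > b:
        return f"{a} is bigger than {b}"
    if b > a:
        return f"{b} is bigger than {a}"
    return f"{a} and {b} are the same"
```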
Sends a user question to the Ollama model and handles function calls if applicable.
Sends a user question to the Ollama model without function calling capabilities.
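Both calls go through the ollama Python client's chat() function. A hedged sketch, where the tool schema below is illustrative rather than the script's exact definition:

```python
MODEL = "llama3.1:8b-instruct-fp16"

# Illustrative JSON-schema description of one tool the model may call.
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current temperature for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}


def build_messages(question: str) -> list:
    """Wrap a user question in the chat message format the client expects."""
    return [{"role": "user", "content": question}]


def chat_with_ollama(question: str):
    """Ask the model, advertising the available tools."""
    import ollama  # needs the ollama package and a running local server

    return ollama.chat(model=MODEL, messages=build_messages(question),
                       tools=[WEATHER_TOOL])


def chat_without_tools(question: str):
    """Ask the model directly, with no function-calling."""
    import ollama

    return ollama.chat(model=MODEL, messages=build_messages(question))
```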
This application implements function calling, a feature that allows the AI model to invoke specific functions based on the user's input. Here's how it works:
- The chat_with_ollama() function sends the user's question to the Ollama model along with a list of available tools (functions).
- If the model determines that a function call is necessary to answer the user's question, it returns a tool_calls object in its response.
- The main loop checks for the presence of tool_calls in the model's response.
- If tool_calls are present, the application iterates through them and executes the corresponding functions with the provided arguments.
- The results of these function calls are then printed to the user.
- If no tool_calls are present or if the arguments are invalid, the application falls back to using the model's direct response.
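The dispatch steps above can be sketched as follows. The registry name and the stand-in tool bodies are assumptions; the tool_calls shape (a function name plus arguments) follows the ollama client's response format:

```python
def get_current_weather(city: str) -> str:
    return f"(would fetch live weather for {city})"  # stand-in body


def compare_numbers(a, b) -> str:
    return f"{max(a, b)} is bigger" if a != b else "they are the same"


# Map tool names the model may emit to local Python functions.
AVAILABLE_TOOLS = {
    "get_current_weather": get_current_weather,
    "compare_numbers": compare_numbers,
}


def handle_message(message: dict) -> list:
    """Run any tool calls in the model's reply; fall back to its plain text."""
    results = []
    for call in message.get("tool_calls") or []:
        fn = call["function"]
        tool = AVAILABLE_TOOLS.get(fn["name"])
        if tool is None:
            results.append(f"Unknown tool: {fn['name']}")
            continue
        try:
            results.append(tool(**fn["arguments"]))
        except TypeError:
            # Invalid arguments: skip this call and fall back below.
            pass
    return results or [message.get("content", "")]
```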
This approach allows the AI to leverage external data sources (like weather information) or perform specific computations (like number comparison) when needed, enhancing its ability to provide accurate and relevant responses.
The main() function runs an infinite loop that:
- Prompts the user for input
- Sends the input to the Ollama model
- Processes any function calls returned by the model
- Displays the result or the model's response
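Stripped down, that loop could look like this sketch; the ask parameter stands in for the real call into the model so the structure is visible on its own:

```python
def should_quit(text: str) -> bool:
    """True when the user typed the exit command."""
    return text.strip().lower() == "quit"


def main(ask=lambda q: {"message": {"content": f"(model reply to {q!r})"}}):
    """Prompt, send to the model, print tool results or the direct reply."""
    while True:
        question = input("You: ")
        if should_quit(question):
            break
        message = ask(question)["message"]
        if message.get("tool_calls"):
            for call in message["tool_calls"]:
                print("Tool requested:", call["function"]["name"])
        else:
            print(message.get("content", ""))
```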
This application requires an active internet connection to retrieve weather information from wttr.in; the Ollama model itself runs locally once downloaded.
Feel free to suggest changes to this example code.