diff --git a/README.md b/README.md
index c062352..9aaf906 100644
--- a/README.md
+++ b/README.md
@@ -2,6 +2,7 @@
OllamaSpring is a comprehensive macOS client for managing the various models offered by the Ollama community (now with support for Groq API services), and for creating conversational AI experiences. This is an open-source and free software project, and we welcome users and developers to participate.
- Supports all Ollama Models
+- Supports Ollama HTTP host configuration
- Stream Response Control
- Model Download and Deletion
- Conversation and History Contexts
@@ -30,6 +31,10 @@ If your Mac is not powerful enough to run the Ollama open-source models locally,
If your network accesses the internet through an HTTP proxy, you can now configure it using the HTTP Proxy feature in OllamaSpring (available in the toolbar).
+### Ollama HTTP Host Config
+
+By default, Ollama serves its API at http://localhost:11434. This setting lets you point OllamaSpring at a different Ollama HTTP host or port, for example an Ollama instance running on a remote machine.
+
### Quick Completion
Quick Completion allows you to send prompts quickly by activating it with cmd + shift + h. Update OllamaSpring to v1.1.5+ or install it from the [Releases](https://github.com/CrazyNeil/OllamaSpring/releases) section.
@@ -42,7 +47,7 @@ System Requirements:
- macOS 14.0 or later
- [Ollama](https://ollama.com) installed
-Download the latest release package (v1.2.2) from the [Releases](https://github.com/CrazyNeil/OllamaSpring/releases) section. Simply unzip the package and drag it into your Applications folder, or install the sandbox version (Taify) (v1.2.1) from the App Store. _Note: The sandbox version is subject to Apple App Store review. For the latest updates, we recommend using the binary installation package._
+Download the latest release package (v1.2.3) from the [Releases](https://github.com/CrazyNeil/OllamaSpring/releases) section. Simply unzip the package and drag it into your Applications folder, or install the sandbox version (Taify, v1.2.2) from the App Store. _Note: The sandbox version is subject to Apple App Store review. For the latest updates, we recommend using the binary installation package._