llama.cpp "chat" Qt GUI #602
niansa started this conversation in Show and tell
Hey!
Today I sat down and created a simple llama.cpp GUI in Qt for few-shot prompts:
[Screenshot: the GUI running the 7B model]
I've tested it on both Linux and Windows, and it should work on macOS too. It renders Markdown and now supports multi-line responses.
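As an aside for anyone building something similar: Qt can render Markdown natively via `QTextEdit::setMarkdown()` (available since Qt 5.14, and inherited by `QTextBrowser`). Here's a minimal sketch of that part; this is my own illustration, not llamaq's actual code:

```cpp
#include <QApplication>
#include <QTextBrowser>

int main(int argc, char *argv[]) {
    QApplication app(argc, argv);

    QTextBrowser view;
    // A model reply containing Markdown, rendered with formatting applied.
    view.setMarkdown("**Hello!** Here is a list:\n\n- one\n- two\n");
    view.show();

    return app.exec();
}
```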
I want to add further customization options, as this is all there is for now:

[Screenshot: the current options]
Currently the GUI runs the model locally, but I plan to add an option to run inference on a (potentially much more powerful) server instead. However, this will have to wait until processing multiple prompts in the same context is implemented.
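For the curious, here's a rough sketch of how a Qt front end can keep the window responsive while the model generates locally: run the generation in a worker thread with QtConcurrent and deliver the result back via a `QFutureWatcher`. This is a hypothetical illustration, not llamaq's code, and `runInference()` is just a stand-in for the real llama.cpp calls:

```cpp
#include <QApplication>
#include <QFutureWatcher>
#include <QString>
#include <QTextBrowser>
#include <QtConcurrent>

// Stand-in for the real llama.cpp calls (load model, tokenize prompt,
// sample tokens in a loop). Hypothetical; the actual API differs.
static QString runInference(const QString &prompt) {
    return "*Model output for:* " + prompt;
}

int main(int argc, char *argv[]) {
    QApplication app(argc, argv);

    QTextBrowser view;
    view.show();

    // The watcher delivers the worker's result back on the GUI thread,
    // so the window keeps repainting while generation runs.
    QFutureWatcher<QString> watcher;
    QObject::connect(&watcher, &QFutureWatcher<QString>::finished,
                     [&view, &watcher] { view.setMarkdown(watcher.result()); });
    watcher.setFuture(QtConcurrent::run(runInference, QString("Hey!")));

    return app.exec();
}
```

The same structure would also make the planned server mode a drop-in change: only the body of the worker function needs to swap local evaluation for a network request.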
I am also considering adding Alpaca/GPT4All support.
You can find the project here: https://gitlab.com/niansa/llamaq
Hope you have fun with this!
niansa