Server: Fix system_prompt handling #7153
Conversation
@ggerganov @phymbert In addition to this, I found a problem: any user can change the system prompt, which will affect other users using the same server. This poses a small security risk. Do you think we need to introduce some kind of "system prompt lock" in the future? (maybe some kind of …)
Thanks, I have never used the system prompt, so I have no strong position.
Yeah, the system prompt can be changed by anyone. Maybe it's better to have it locked by default and allow changing it only if a specific argument is added when starting the server.
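A minimal sketch of the suggestion above, in Python for illustration (the llama.cpp server is C++, and the class, field, and method names here are hypothetical, not actual llama.cpp APIs):

```python
# Hypothetical sketch: locked-by-default system prompt, changeable only
# when the server was started with an explicit unlock argument.

class ServerState:
    def __init__(self, system_prompt: str = "", locked: bool = True):
        # Locked by default, per the suggestion; a startup argument
        # would set locked=False to allow runtime changes.
        self.system_prompt = system_prompt
        self.locked = locked

    def apply_system_prompt_change(self, new_prompt: str) -> bool:
        """Apply a requested change; return True if it took effect."""
        if self.locked:
            # Ignore the request (a real server might return HTTP 403).
            return False
        self.system_prompt = new_prompt
        return True
```

With this shape, a multi-user deployment keeps the default lock, while a single-user setup can opt in to runtime changes.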
System message == system prompt? Llama 3 and OpenAI ChatML specify a system role and system messages in their APIs; these are sometimes also called the system prompt. The ability to include role="system" from the application side is essential to many features being implemented in applications, which achieve outcomes by inserting a detailed system message at the head of the chat; the system message can even change from one request to the next within the same chat. https://wegrok.ai uses system messages, and I am sure OpenAI uses them for their "Custom instructions" and meta.ai uses them for various chat starters. I would argue the ability to lock out system messages in the APIs is undesirable. System messages do not impact the model or other users unless they are applied at the server level, but that is not the usual case afaict.
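For reference, the per-request role="system" usage described above looks like this in the OpenAI-style chat format (a sketch; the model name is a placeholder, and no real endpoint is called):

```python
import json

def build_chat_request(system_message: str, user_message: str) -> dict:
    # OpenAI-style chat payload: the system message travels with each
    # request, so different requests in the same chat can carry different
    # system messages without touching any server-wide state.
    return {
        "model": "placeholder-model",  # placeholder, not a real model name
        "messages": [
            {"role": "system", "content": system_message},
            {"role": "user", "content": user_message},
        ],
    }

payload = build_chat_request("You are a terse assistant.", "Hello!")
print(json.dumps(payload, indent=2))
```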
@scottstirling Having the system message as a system prompt can be useful, but only when we can correctly format it. For the moment, that would be quite messy to implement, so I'd rather not have this feature and instead leave it to the application layer to handle (i.e. you can have a "proxy" between the llama.cpp server and the frontend).

Also, just to be clear, this idea is not the same as what you're mentioning. https://wegrok.ai (and all other APIs) evaluate the system message for each input sequence, not all sequences, so each user can set their own system message. If they used the same idea as …

@phymbert Because a while ago you mentioned that you're using the server in a prod-like environment, I think it's better to also ask you about the risk. But if that's not a big problem, you can ignore this. Thank you anyway for the confirmation.
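The "proxy" idea mentioned above can be sketched as a thin layer that prepends an application-chosen system message to each request before forwarding it to the server (the function name is hypothetical and the actual forwarding is omitted):

```python
def inject_system_message(request: dict, system_message: str) -> dict:
    # Prepend a system message unless the client already supplied one,
    # leaving the original request object untouched.
    messages = list(request.get("messages", []))
    if not messages or messages[0].get("role") != "system":
        messages.insert(0, {"role": "system", "content": system_message})
    return {**request, "messages": messages}
```

This keeps per-user system messages entirely in the application layer, so the llama.cpp server never needs a shared, mutable system prompt.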
Resolves #7152 #7089

- `assistant_name` and `user_name` are assigned but unused ==> they are now removed. Users can use `"stop"` to specify the antiprompt.
- `system_prompt` will now simply be a string.
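Based on the description above, a completion request after this change might look like the following (a sketch for illustration; the surrounding field values are made up, and the key point is that `system_prompt` is a plain string and the antiprompt goes in `stop`):

```python
import json

# Illustrative request shape after this PR: `system_prompt` is a plain
# string, and the antiprompt is given via "stop" instead of the removed
# `user_name` / `assistant_name` fields.
request = {
    "prompt": "User: Hello\nAssistant:",
    "system_prompt": "You are a helpful assistant.",  # plain string now
    "stop": ["User:"],  # antiprompt specified via "stop"
}
print(json.dumps(request, indent=2))
```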