Is it on the roadmap to support ChatML, as proposed by OpenAI?
-
Not officially (I don't think so), but I was planning on giving it a go after #3538, since that enables proper control/special-token usage in prompt "templates". Some kind of very simple per-model prompt template / library system would be nice to have; I just need to figure out a way to do it cleanly and non-invasively. A rough sketch of the idea is below.
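To make the idea concrete, here is a minimal client-side sketch of what a per-model template registry could look like. Everything in it (the `TEMPLATES` dict, `build_prompt`, the template strings) is hypothetical and not part of llama.cpp; it only illustrates keeping one prompt format per model family:

```python
# Hypothetical sketch of a per-model prompt-template registry.
# None of these names exist in llama.cpp; this only illustrates the idea.

TEMPLATES = {
    # template name -> (per-turn format, final assistant prefix)
    "chatml": ("<|im_start|>{role}\n{content}<|im_end|>\n",
               "<|im_start|>assistant\n"),
    "alpaca": ("### {role}:\n{content}\n\n",
               "### Response:\n"),
}

def build_prompt(template_name: str, turns: list[dict]) -> str:
    """Render a list of {role, content} turns with the named template."""
    turn_fmt, assistant_prefix = TEMPLATES[template_name]
    rendered = "".join(turn_fmt.format(**t) for t in turns)
    return rendered + assistant_prefix

if __name__ == "__main__":
    print(build_prompt("chatml", [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ]))
```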
-
You need to explicitly add BOS if the first thing in the prompt would otherwise be another token identifier (in this case, 32001). The client then passes this prompt to the server, and everything else works as usual. It would be better if the identifiers of special tokens could be queried from the server instead of being hard-coded in the client. A hedged sketch of that client-side construction is below.
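For illustration, here is a hedged Python sketch of sending such a prompt to the llama.cpp server's `/completion` endpoint, which accepts a mixed array of text pieces and token ids. The ids below (`BOS = 1`, `<|im_start|> = 32001`, `<|im_end|> = 32000`) are assumptions typical of ChatML fine-tunes such as OpenHermes; they vary per model, which is exactly why querying them from the server would be nicer:

```python
# Hedged sketch: a ChatML prompt as a mixed array of token ids and text
# for llama.cpp's server /completion endpoint. The ids are assumptions
# (typical for ChatML fine-tunes); check your model's tokenizer.
import requests

BOS = 1           # added explicitly, because the prompt would otherwise
                  # start with a raw token id and no BOS is auto-inserted
IM_START = 32001  # <|im_start|>
IM_END = 32000    # <|im_end|>

prompt = [
    BOS,
    IM_START, "system\nYou are a helpful assistant.", IM_END, "\n",
    IM_START, "user\nHello!", IM_END, "\n",
    IM_START, "assistant\n",
]

resp = requests.post(
    "http://localhost:8080/completion",
    json={"prompt": prompt, "n_predict": 128, "stop": ["<|im_end|>"]},
)
print(resp.json()["content"])
```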
-
@zhibor it is merged to main now with the `--chatml` flag
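(Presumably that means something like `./main -m model.gguf --chatml` for the interactive `main` example, with `model.gguf` standing in for your model file; treat the exact invocation as a guess and check `--help` on your build.)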