How can I add a max_tokens parameter to ensure longer responses? #1234
infinitelyloopy-bt asked this question in Q&A (Unanswered) · 1 comment, 1 reply
-
When you select the default model using `fabric --setup` (option 10, if I remember correctly), the prompt after choosing the default model asks for the context length. Is that what you are looking for?
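One point worth separating here: the context length set in `fabric --setup` is the model's total window (prompt plus response), while a max tokens parameter caps only the generated response. The source thread doesn't show fabric's internals, so as an illustration only, here is a minimal sketch of where a `max_tokens` field sits in an OpenAI-style chat request payload; the model name and token numbers are placeholders, not values from this discussion.

```python
# Sketch (assumption: OpenAI-style chat-completion payload).
# "max_tokens" caps only the *generated* response; the context
# length also has to cover the prompt tokens.
payload = {
    "model": "gpt-4o",  # placeholder model name
    "messages": [
        {"role": "user", "content": "Summarize this article at length."}
    ],
    # Upper bound on response tokens, e.g. the 14,000 asked about above.
    "max_tokens": 14000,
}

def fits_in_context(context_length: int, prompt_tokens: int,
                    max_tokens: int) -> bool:
    """True if prompt + requested completion fit in the context window."""
    return prompt_tokens + max_tokens <= context_length
```

So even with a large `max_tokens`, a response can still be cut short if the prompt plus the requested completion exceeds the configured context length.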
-
I am not consistently getting long enough responses, and I realized there is no way to set a max tokens parameter so that, for example, I could set it to 14,000.
Does anyone know how to ensure longer responses and/or how to set a max tokens limit?