-
#471 might be relevant here. Does this work for e.g. OpenAI?
-
Hmm, ideally it would be possible to set this per request/org-thread/file in some variable so it's easily changeable.
-
See gptel-temperature. You can turn on gptel-expert-commands and set the temperature from the transient menu. You can set it globally, per buffer, or per request (flip the "scope" switch), or persist it in your file (just save your chat buffer).

There's no UI option for setting the seed.

Yes, #471 works for all backends. You can use it to specify any request parameter you want, per backend and per model.
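As a rough sketch, the options above might look like this in an init file. `gptel-temperature` is the variable named in the answer; the backend-constructor name and the `:request-params` / `:key` keywords are assumptions based on #471 and may differ in your gptel version, so check the README before copying:

```elisp
;; Global default temperature (documented variable):
(setq gptel-temperature 0.7)

;; Buffer-local override, e.g. for one chat buffer:
(setq-local gptel-temperature 0.2)

;; Extra request parameters per backend (the mechanism from #471).
;; The exact constructor and keyword names are assumptions here:
(gptel-make-openai "OpenAI-seeded"      ; hypothetical backend name
  :key "sk-..."                         ; placeholder API key
  :request-params '(:seed 42))          ; arbitrary request parameter
```

For one-off changes, flipping the "scope" switch in the transient menu (with `gptel-expert-commands` enabled) avoids touching your config at all.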
Answer selected by hrehfeld
-
How can I specify seed & temperature for my requests?