Why are the tokens counted differently than OpenAI? #2
Ah, found it! Now I see that OpenAI's batch-API script uses a different token counter than their tutorial notebook: https://github.com/openai/openai-cookbook/blob/main/examples/api_request_parallel_processor.py#L339. Shouldn't a rate limiter, in any case, be updated after the actual completion is returned by the API, to account for the actual number of output tokens?
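The reconciliation idea raised above can be sketched as a small sliding-window limiter that reserves tokens pessimistically before the request and replaces the estimate with the API-reported usage afterwards. This is an illustrative sketch only; the class and method names (`TokenRateLimiter`, `reserve`, `reconcile`) are hypothetical and not part of `openai-ratelimiter` or any OpenAI client, and it ignores concurrency for simplicity.

```python
import time
from collections import deque


class TokenRateLimiter:
    """Sliding-window token limiter (sketch): reserve an estimate
    up front, then reconcile with the actual usage reported by the API."""

    def __init__(self, tokens_per_minute: int):
        self.limit = tokens_per_minute
        self.window = deque()  # (timestamp, tokens) pairs

    def _used(self, now: float) -> int:
        # Drop entries older than the 60-second window, then sum the rest.
        while self.window and now - self.window[0][0] > 60:
            self.window.popleft()
        return sum(t for _, t in self.window)

    def reserve(self, estimated_tokens: int) -> bool:
        """Admit the request only if the pessimistic estimate fits."""
        now = time.monotonic()
        if self._used(now) + estimated_tokens > self.limit:
            return False
        self.window.append((now, estimated_tokens))
        return True

    def reconcile(self, actual_tokens: int) -> None:
        """Replace the most recent reservation with the real usage
        (simplistic: assumes reconcile directly follows its reserve)."""
        ts, _ = self.window[-1]
        self.window[-1] = (ts, actual_tokens)
```

After the completion returns, calling `reconcile(response_usage_total)` frees the difference between the `max_tokens` estimate and the tokens actually consumed, which is exactly the correction being asked about.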
Hi,
Thanks. It looks like there are two contradictory implementations from OpenAI: one in the notebook, the other in their batch call. They differ not just in how they account for `max_tokens`, but also in how they handle the role names depending on the model (admittedly, a rather smaller factor!)
…On Tue, Jul 25, 2023, 13:18 Youssef Benhammouda ***@***.***> wrote:
Hi,
I will dig into the OpenAI notebooks and update the implementation if
necessary.
As far as I know, each request's tokens are calculated this way:
Prompt tokens + Max tokens = request total tokens.
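The per-request accounting described in the quoted message ("Prompt tokens + Max tokens = request total tokens") can be made concrete with a one-line helper. This is a minimal sketch; the function name is hypothetical.

```python
def request_token_budget(prompt_tokens: int, max_tokens: int) -> int:
    """Pessimistic per-request budget: the prompt is counted exactly,
    and the completion is assumed to use its full max_tokens allowance."""
    return prompt_tokens + max_tokens
```

For example, a 120-token prompt with `max_tokens=256` reserves a budget of 376 tokens against the rate limit, even if the model ends up generating far fewer.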
Hi @blaze-Youssef!
First, thanks for the tool, very useful, as I'm stuck in openlimit with similar issues to the ones you met in shobrook/openlimit#4.
However, I'm a bit stuck trying to understand: what does the argument `max_tokens` correspond to, please, in https://github.com/blaze-Youssef/openai-ratelimiter/blob/main/openai_ratelimiter/defs.py#L9?
I am trying to understand it, but the way you count tokens, which is the same as in openlimit (https://github.com/shobrook/openlimit/blob/master/openlimit/utilities/token_counters.py#L14), is different from the one in OpenAI's cookbook, https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb (see section 6).
Would you or @shobrook be able to help clarify this counting, please?
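For context, section 6 of the cookbook counts chat tokens with a fixed per-message overhead plus a reply-priming constant, which is why its totals differ from a plain sum over the message text. The sketch below reproduces that scheme with the tokenizer abstracted out as a `count` callable (standing in for `len(encoding.encode(text))` with tiktoken); the overhead constants (3 per message, 1 per `name` field, 3 priming tokens) are the ones the cookbook uses for the gpt-3.5-turbo-0613/gpt-4 family.

```python
from typing import Callable, Dict, List


def num_tokens_from_messages(
    messages: List[Dict[str, str]],
    count: Callable[[str], int],
    tokens_per_message: int = 3,
    tokens_per_name: int = 1,
) -> int:
    """Cookbook-style (section 6) chat token count.

    `count` abstracts the tokenizer so the sketch stays self-contained;
    with tiktoken it would be len(encoding.encode(text)).
    """
    total = 0
    for message in messages:
        total += tokens_per_message  # per-message framing overhead
        for key, value in message.items():
            total += count(value)
            if key == "name":
                total += tokens_per_name
    total += 3  # every reply is primed with <|start|>assistant<|message|>
    return total
```

Using a toy whitespace tokenizer, `[{"role": "user", "content": "hello world"}]` costs 3 (message overhead) + 1 (`user`) + 2 (`hello world`) + 3 (priming) = 9 tokens, whereas a naive count of the content alone would give 2. That framing overhead is exactly the discrepancy being asked about.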