This repository has been archived by the owner on Mar 6, 2024. It is now read-only.

Support gpt 3.5 turbo 16k model (#424)
`TokenLimits` is the only place that needs to be modified. The token limits have been set accordingly. Closes #406
<!-- This is an auto-generated comment: release notes by OSS CodeRabbit
-->
### Summary by CodeRabbit

**New Feature:**
- Added support for the "gpt-3.5-turbo-16k" model in the `TokenLimits`
class.
- Set the `maxTokens` limit to 16300 and the `responseTokens` limit to
3000 for the new model.

> 🎉 With tokens aplenty, we set the stage,
> For the "gpt-3.5-turbo-16k" to engage.
> More power, more wisdom, in every page,
> A new chapter begins, let's turn the page! 🚀
<!-- end of auto-generated comment: release notes by OSS CodeRabbit -->
HyunggyuJang authored Aug 11, 2023
1 parent 500adcb commit 4c02adf
Showing 1 changed file with 3 additions and 0 deletions.
3 changes: 3 additions & 0 deletions src/limits.ts
@@ -9,6 +9,9 @@ export class TokenLimits {
   if (model === 'gpt-4-32k') {
     this.maxTokens = 32600
     this.responseTokens = 4000
+  } else if (model === 'gpt-3.5-turbo-16k') {
+    this.maxTokens = 16300
+    this.responseTokens = 3000
   } else if (model === 'gpt-4') {
     this.maxTokens = 8000
     this.responseTokens = 2000
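For context, the change slots the new model into the existing `if`/`else if` chain in the `TokenLimits` constructor. The sketch below shows roughly how that class reads after this commit; it is an assumption based only on the diff hunk, so the real `src/limits.ts` may declare additional fields (e.g. a default branch with other limits shown here is hypothetical).

```typescript
// Sketch of TokenLimits after this commit (assumed shape; only the
// gpt-4-32k, gpt-3.5-turbo-16k, and gpt-4 branches appear in the diff).
export class TokenLimits {
  maxTokens: number
  responseTokens: number

  constructor(model = 'gpt-3.5-turbo') {
    if (model === 'gpt-4-32k') {
      this.maxTokens = 32600
      this.responseTokens = 4000
    } else if (model === 'gpt-3.5-turbo-16k') {
      // Newly added in this commit: 16k-context model with a
      // slightly reduced budget to leave headroom for overhead.
      this.maxTokens = 16300
      this.responseTokens = 3000
    } else if (model === 'gpt-4') {
      this.maxTokens = 8000
      this.responseTokens = 2000
    } else {
      // Hypothetical fallback for models not shown in the diff.
      this.maxTokens = 4000
      this.responseTokens = 1000
    }
  }
}
```

Note the `maxTokens` values sit a little under each model's advertised context window (16300 vs 16384), reserving a margin so request plus response never exceeds the hard limit.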
