
Fix ChatCompletionCreateResponse handling multiple parallel tool call #494

Merged
merged 5 commits on Apr 3, 2024

Conversation

David-Buyer
Contributor

As reported in #493, this should fix the issue.

@kayhantolga
Member

Hi @David-Buyer ,

Do you think the code will handle it if the stream returns chunks with shuffled indexes, rather than in order? I mean, like this:

1. chunk index 0 - data
2. chunk index 0 - data
3. chunk index 1 - data
4. chunk index 0 - data
5. chunk index 1 - data
6. chunk index 1 - data

@kayhantolga kayhantolga added the bug Something isn't working label Apr 2, 2024
@kayhantolga kayhantolga added this to the 7.4.7 milestone Apr 2, 2024
@David-Buyer
Contributor Author

Hi @David-Buyer ,

Do you think the code will handle it if the stream returns chunks with shuffled indexes, rather than in order? I mean, like this:

1. chunk index 0 - data
2. chunk index 0 - data
3. chunk index 1 - data
4. chunk index 0 - data
5. chunk index 1 - data
6. chunk index 1 - data

Hello @kayhantolga, I don't think it handles that case. It seems to expect to move forward in sequential order when grouping arguments, assuming they belong to the last function tool added to the list. Do you think a case where data arrives shuffled is to be expected?

@kayhantolga
Member

I can't say it's impossible, given the lack of OpenAI documentation, and it's also challenging to experiment. However, I think that if they decide to improve the model to allow both tools to return asynchronously, we may see shuffled data. If it's not too much of a headache for you, we could take a defensive approach and handle such cases.
If you'd rather not make that improvement now, I would merge the branch as it is, and we can try to fix this later.

@David-Buyer
Contributor Author

I can't say it's impossible, given the lack of OpenAI documentation, and it's also challenging to experiment. However, I think that if they decide to improve the model to allow both tools to return asynchronously, we may see shuffled data. If it's not too much of a headache for you, we could take a defensive approach and handle such cases. If you'd rather not make that improvement now, I would merge the branch as it is, and we can try to fix this later.

It's a reasonable way to proceed, taking this optimization forward in advance. At the moment the fix is useful as it is, since it addresses the issue some users reported, so you can merge it if you want to ship a partial improvement. As for the challenge you mentioned, I already have an idea and will try it as soon as possible, keeping you updated too :)

@David-Buyer
Contributor Author

I added an improvement which should handle asynchronous responses with shuffled data.
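A defensive merge along these lines could group streamed tool-call chunks by their `index` rather than by arrival order, so a shuffled sequence (0, 0, 1, 0, 1, 1) still reassembles correctly. The following is a minimal sketch with hypothetical type and method names, not the actual PR code:

```csharp
using System.Collections.Generic;
using System.Linq;

public class ToolCallChunk
{
    public int Index { get; set; }
    public string? Id { get; set; }
    public string? Arguments { get; set; }
}

public static class ToolCallAccumulator
{
    // Merge chunks keyed by Index: the first chunk seen for an index carries
    // the id; later chunks for that index only append argument fragments.
    public static List<ToolCallChunk> Merge(IEnumerable<ToolCallChunk> chunks)
    {
        var byIndex = new Dictionary<int, ToolCallChunk>();
        foreach (var chunk in chunks)
        {
            if (byIndex.TryGetValue(chunk.Index, out var existing))
            {
                existing.Id ??= chunk.Id;
                existing.Arguments += chunk.Arguments;
            }
            else
            {
                byIndex[chunk.Index] = new ToolCallChunk
                {
                    Index = chunk.Index,
                    Id = chunk.Id,
                    Arguments = chunk.Arguments ?? ""
                };
            }
        }
        // Return the merged calls in index order regardless of arrival order.
        return byIndex.Values.OrderBy(c => c.Index).ToList();
    }
}
```

Because the dictionary lookup is order-independent, the same merged result comes out whether the chunks arrive sequentially or interleaved.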

@@ -27,4 +27,11 @@ public class ToolCall
/// </summary>
[JsonPropertyName("function")]
public FunctionCall? FunctionCall { get; set; }

public void Deconstruct(out int index, out string? id, out string? type)
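For context, a `Deconstruct` method enables C# tuple deconstruction. The body is not shown in the truncated diff above; a self-contained sketch with an assumed body (simply copying the properties out) would be:

```csharp
public class ToolCall
{
    public int Index { get; set; }
    public string? Id { get; set; }
    public string? Type { get; set; }

    // Assumed implementation: expose the properties as out parameters so
    // callers can write: var (index, id, type) = toolCall;
    public void Deconstruct(out int index, out string? id, out string? type)
    {
        index = Index;
        id = Id;
        type = Type;
    }
}
```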
Member


Why do we need this, @David-Buyer?

Contributor Author


This is a leftover I forgot to remove from an earlier approach I attempted. You can remove it; it's not useful for the moment.

…d from the `ToolCall` class. This method was previously used to deconstruct the `ToolCall` object into its `index`, `id`, and `type` properties.

List of Changes:
1. Removed the `Deconstruct` method from the `ToolCall` class. This method was responsible for breaking down the `ToolCall` object into its `index`, `id`, and `type` properties.

References to the code changes:
- Removal of `Deconstruct` method from `ToolCall` class.
@kayhantolga kayhantolga merged commit 99826f3 into betalgo:dev Apr 3, 2024
Development

Successfully merging this pull request may close these issues.

ChatCompletionCreateResponse incorrectly deserializes multiple tool calls when streaming
2 participants