
fix: include_feedback breaks when > 10 results AND no expand fields listed #2541

Merged: 11 commits into master from tim/demo_of_error, Oct 1, 2024

Conversation

tssweeney (Collaborator) commented Sep 30, 2024

Simplify the dynamic batch size calculation in the calls stream: start at 10, then 100, then cap at a maximum of 500.
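
A minimal sketch of the ramp described above, for illustration only (the constant and function names are hypothetical, not weave's actual internals):

```python
# Hypothetical sketch of the simplified batch-size ramp: 10 -> 100 -> 500.
INITIAL_BATCH_SIZE = 10
BATCH_GROWTH_FACTOR = 10
MAX_BATCH_SIZE = 500


def next_batch_size(current: int) -> int:
    """Grow the batch size tenfold, capped at MAX_BATCH_SIZE."""
    return min(MAX_BATCH_SIZE, current * BATCH_GROWTH_FACTOR)


batch_size = INITIAL_BATCH_SIZE
seen = []
for _ in range(3):
    seen.append(batch_size)
    batch_size = next_batch_size(batch_size)
assert seen == [10, 100, 500]
```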


gtarpenning marked this pull request as ready for review September 30, 2024 23:09
gtarpenning requested a review from a team as a code owner September 30, 2024 23:09
gtarpenning enabled auto-merge (squash) September 30, 2024 23:50
@@ -2664,6 +2664,42 @@ def return_nested_object(nested_obj: NestedObject):
assert call_result.output == nested_ref.uri()


@pytest.mark.parametrize("batch_size", [1, 11, 101, 120])
Collaborator commented:

why these numbers?

Member replied:

This tests batches of 10, 100, and 500. 120 because what if there's an off-by-one error and we are just getting lucky 🧠

Collaborator replied:

You should add a comment -- these look like magic numbers otherwise. I would make 10, 100, and 500 constants and re-use them here with that comment.

gtarpenning (Member) commented Oct 1, 2024:

Because of how the tests insert calls synchronously, generating 500 calls actually gets pretty expensive -- 60+ seconds expensive. 120 completed in roughly 15 seconds. I will add a comment.

Collaborator replied:

Yeah, you don't have to do 500, just comments re: what these numbers mean.
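
A hedged sketch of what the suggested constants plus explanatory comment might look like in the test (constant names and the test body are illustrative, not the actual change):

```python
import pytest

# Batch-size ramp used by the calls stream: 10 -> 100 -> capped at 500.
INITIAL_BATCH_SIZE = 10
SECOND_BATCH_SIZE = 100
MAX_BATCH_SIZE = 500


# 1 and 11 cross the first batch boundary, 101 crosses the second.
# 120 (rather than MAX_BATCH_SIZE) keeps the test fast while still
# guarding against off-by-one errors past the 100 boundary.
@pytest.mark.parametrize(
    "batch_size",
    [1, INITIAL_BATCH_SIZE + 1, SECOND_BATCH_SIZE + 1, 120],
)
def test_calls_stream_batch_sizes(batch_size):
    ...  # test body stubbed; the real test inserts `batch_size` calls
```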

# double batch size up to what refs_read_batch can handle
batch_size = min(max_size, batch_size * 2)
# *** Dynamic increase from 10 to 500 ***
batch_size = min(500, batch_size * 10)
Collaborator (Author) commented:

Let's move this to a top-level constant.
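
An illustrative version of that suggestion: hoist the literal 500 into a module-level constant (the constant name is hypothetical):

```python
# Hypothetical top-level constant for the batch-size cap.
MAX_CALLS_STREAM_BATCH_SIZE = 500


def grow_batch_size(batch_size: int) -> int:
    # *** Dynamic increase from 10 to 500 ***
    return min(MAX_CALLS_STREAM_BATCH_SIZE, batch_size * 10)
```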

gtarpenning merged commit 0490554 into master Oct 1, 2024
77 checks passed
gtarpenning deleted the tim/demo_of_error branch October 1, 2024 18:38
github-actions bot locked and limited conversation to collaborators Oct 1, 2024