Large terms database generating too many WP API requests on search #20734
Comments
Howdy @mgrenierfarmmedia! I was able to reproduce, and indeed there are tons of useless requests, especially at the first-letter stage when we don't even display anything at all. Maybe we could introduce some "endless scrolling" to that list of suggestions and fetch each page of results one at a time? In either case it's also worth considering how common these thousands of taxonomy items are, to assess the best solution.
@draganescu I would suggest starting with the minLength option as a hotfix, then moving to endless scrolling as a feature. It already searches and orders the results by term count, so the first 100 are the most likely to be accurate.
Looks like adding a minLength option should be easy enough by adding:

```js
if ( search.length < minLength ) {
	return;
}
```
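As a sketch of that hotfix idea (hypothetical names — `createSearchHandler` and `fetchTerms` are not Gutenberg APIs, just stand-ins for illustration), a configurable minLength guard could wrap the search handler so short inputs never reach the REST API:

```javascript
// Minimal sketch, assuming the default mirrors the old behavior of 3.
const DEFAULT_MIN_LENGTH = 3;

// Wraps a term-fetching function so that searches shorter than
// minLength bail out before any request is made.
function createSearchHandler( fetchTerms, minLength = DEFAULT_MIN_LENGTH ) {
	return function onSearch( search ) {
		// Inputs like "e" or "ex" resolve immediately with no results.
		if ( search.length < minLength ) {
			return Promise.resolve( [] );
		}
		return fetchTerms( search );
	};
}
```

With the old default of 3, typing "extra" would fire requests only for "ext", "extr", and "extra", skipping the expensive one- and two-letter queries entirely.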
Yes I've also noticed recently that
I can't work out how that abort method is supposed to be handled from looking at the current code base. @youknowriad, I see you did some of the work on the api-fetch package, do you know how this should be handled? Out of interest I had a play around with using an AbortController to handle this, which can just be passed through in the fetch options... it seems to work pretty reliably, but it would need to be polyfilled for IE. This is just a PoC PR for my benefit mostly; I am sure there must be some way of cancelling requests already in Core, but I just can't see it for looking!
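That PoC can be sketched roughly like this (assuming plain `fetch` rather than api-fetch; `makeTermSearcher` and the injectable `doFetch` parameter are hypothetical names introduced for illustration, not existing Core code):

```javascript
// Sketch: keep one controller per search box, and abort the in-flight
// request whenever a new keystroke starts a fresh search.
// AbortController would need a polyfill for IE11.
function makeTermSearcher( doFetch = fetch ) {
	let controller = null;
	return function searchTerms( search ) {
		// Cancel the stale request from the previous keystroke, if any.
		if ( controller ) {
			controller.abort();
		}
		controller = new AbortController();
		return doFetch(
			'/wp-json/wp/v2/tags?search=' + encodeURIComponent( search ),
			{ signal: controller.signal }
		);
	};
}
```

The key point is that the signal is just a regular fetch option, so whichever transport layer ultimately issues the request only needs to forward it.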
Hey all,
howdy @mgrenierfarmmedia I don't think it is fixed. |
Hi @draganescu, this is blocking us (we run a lot of WordPress publications), and we would really like to be using Gutenberg. This call rate is humongous; starting at a specific minLength (3, as it used to be) would cut the number of requests by about 90%. I see no handler, dispatch, or way to override this except by working in core itself. Do you know when this could be integrated? Or how I could help prioritize this? Best regards,
Howdy @mgrenierfarmmedia and thank you for working on this. The fastest way to speed things up at this stage is to try a Pull Request with a proposed solution. If that is not possible, try showing up in one of the core editor meetings, or even just leave a comment on the next agenda (the next one will be published next week) explaining why this is important and, even better, proposing a solution. Other than that, it is one of the 538 open bugs, and it will be attended to as it gets triaged, picked up in a bug scrub, or picked up by someone.
@draganescu I did the base code to make this stop. Pretty sure it is alright, and the PR is up.
Summary
Searching for a term (post_tag) while editing a post against a large database can generate a lot of requests, because the editor fetches all terms matching the current search. Since the API returns at most 100 results per page, it keeps calling until it has all the terms.
These calls don't stop, even if the search changes.
Example: you have 20k post_tags and 11k of them contain an "e". You start typing "extra": the first keystroke searches for "e", which generates 110 requests, one after another. You then enter "ex", which, say, fetches 47 pages of results, concurrent with the initial 110. "ext" generates 24 pages, "extr" generates 5 calls, and "extra" generates 1.
That is 187 requests (110 + 47 + 24 + 5 + 1) spamming the server for a single search.
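The fan-out above follows directly from the 100-per-page cap: each keystroke needs ceil(matches / 100) sequential page requests, and nothing cancels the previous batch. A back-of-the-envelope sketch (the match counts are inferred from the page counts in the example, not measured):

```javascript
// The REST API caps collection responses at 100 items per page.
const PER_PAGE = 100;

// Number of sequential page requests needed to fetch every match.
function pagesFor( matchCount ) {
	return Math.max( 1, Math.ceil( matchCount / PER_PAGE ) );
}

// Assumed match counts per keystroke for "e", "ex", "ext", "extr", "extra",
// chosen so the page counts match the example above.
const matchesPerKeystroke = [ 11000, 4700, 2400, 500, 1 ];

const totalRequests = matchesPerKeystroke
	.map( pagesFor )
	.reduce( ( sum, pages ) => sum + pages, 0 );
// 110 + 47 + 24 + 5 + 1 = 187 requests for one five-letter search.
```

Note that both fixes discussed in the thread attack different terms of this sum: a minLength guard drops the large early terms, while request cancellation stops the stale batches from running to completion.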