UI unresponsive when using insert batching #257
I love this idea. Another solution to the problem would be:
|
Yeah, that might be the easier and better route to take.
Hi,
So if I understand correctly, the problem is that the per-database view can get too slow if there are a lot of very long query texts (which happens in your case because you batch inserts into multi-valued statements). I think the problem actually comes from the JavaScript library that formats the query text. Is the per-query view's performance normal? If so, we could simply emit a truncated SQL text in the per-database view, since the full query text is only available on hover of each query, which isn't going to be helpful for overly lengthy queries anyway.
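To illustrate the truncation idea in isolation, here is a minimal sketch (the function name and limit are hypothetical, not PoWA's actual code): the server shortens any over-long query text before serializing it, so the client-side SQL prettifier never receives multi-hundred-kilobyte statements.

```python
# Hypothetical sketch: truncate long query texts server-side before the
# per-database view serializes them to JSON.
MAX_QUERY_LEN = 100  # assumed limit; could be made configurable


def truncate_query(query, limit=MAX_QUERY_LEN):
    """Return the query unchanged if short enough, else a truncated copy."""
    if len(query) <= limit:
        return query
    return query[:limit]
```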
The per-query view loads nearly instantaneously for a batch of 1000. Additionally, the per-database view also works fine if the batch size is reduced to 1. A quick check on the API endpoint showed the download taking around 30s. The 30s download time could be a separate issue, but once that data is loaded the UI grinds to a halt when searching or sorting, which do not appear to trigger additional API requests.
The 30s indeed seems long to download the data, but the UI halting is likely due to the JS prettifying the SQL queries. If you know how to patch your local powa-web server, you could try this patch as a quick POC limiting the query text in the per-database view:

```diff
diff --git a/powa/database.py b/powa/database.py
index 368a795..cc7edb8 100644
--- a/powa/database.py
+++ b/powa/database.py
@@ -845,7 +845,7 @@ class ByQueryMetricGroup(MetricGroupDef):
     cols = [
         "sub.srvid",
         "sub.queryid",
-        "ps.query",
+        "CASE WHEN length(ps.query) > 100 THEN substr(ps.query, 1, 100) ELSE ps.query END AS query",
         "sum(sub.calls) AS calls",
         "sum(sub.runtime) AS runtime",
         mulblock("shared_blks_read", fn="sum"),
```

That should be enough to validate the JS problem.
That worked! The API request took less than a second and the response size was reduced to 617KB. Obviously the highlighting shows the truncated query, but that is to be expected. I also bumped the limit as a test: 1000 characters worked fine too. Reducing from 250000 characters sure makes a difference! 😜
Great news, thanks a lot for testing! I will work on a real patch for that, maybe make the limit configurable. I tried to add some trailing
Hello,
I've got PoWA running locally against Postgres, and as part of some performance enhancements to our application we've implemented batching of inserts. Batching is set up in such a way that we batch up to 1000 records or wait 5 seconds to force a batch. The majority of batches are 1000 records, but given the time element we could have the same query repeated up to 1000 times. This is causing the Web UI to lock up and become unusable.
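The batching described above (flush at 1000 records or after 5 seconds, whichever comes first) can be sketched roughly as follows. This is a simplified illustration, not the reporter's actual implementation; all names are hypothetical.

```python
import time


class InsertBatcher:
    """Sketch of a size-or-time insert batcher: flush when either
    max_size records have accumulated or max_wait seconds have elapsed
    since the first record of the current batch."""

    def __init__(self, flush, max_size=1000, max_wait=5.0):
        self.flush = flush          # callable that receives a full batch
        self.max_size = max_size
        self.max_wait = max_wait
        self.batch = []
        self.started = None         # time the current batch was opened

    def add(self, record):
        if self.started is None:
            self.started = time.monotonic()
        self.batch.append(record)
        if (len(self.batch) >= self.max_size
                or time.monotonic() - self.started >= self.max_wait):
            self._flush()

    def _flush(self):
        if self.batch:
            self.flush(self.batch)
            self.batch = []
            self.started = None
```

A real version would also need a background timer so a partially filled batch is flushed even when no new record arrives, but the size/time trade-off above is the part that produces the repeated multi-valued statements the issue describes.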
Perhaps there is a simple solution here, but in the event there isn't, I wanted to pitch an idea. Rather than sending the query over in the
/database_all_queries
API response, could only the query ID and a reference to a pre-generated/cached image of the query be included? The WebUI would show the image by default, and then hovering or maybe clicking some button could fetch the individual query from another API endpoint. Sure, an image is more data over the wire, but it would reduce CPU usage on the client.

Example insert:
PoWA Web Version 5.0.1
Thanks 😄
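The split proposed above, where the list response carries only query IDs and the full text is fetched lazily per query, could be sketched like this. All names here are hypothetical; they are not PoWA's actual endpoints.

```python
# Hypothetical sketch of the lazy-fetch proposal: the list endpoint
# returns only queryids (plus whatever metrics the grid needs), and the
# full query text is served by a separate per-query endpoint, fetched
# only when the user hovers or clicks.
QUERIES = {}  # queryid -> full query text (stand-in for the database)


def list_queries():
    """Lightweight payload for the per-database grid: no query text."""
    return [{"queryid": qid} for qid in QUERIES]


def get_query_text(queryid):
    """Fetched on demand; one query at a time stays cheap to transfer
    and to prettify on the client."""
    return QUERIES.get(queryid)
```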