Creating ChatMessages is unreasonably slow #6064
Comments
@EllenOrange Agreed, this is totally unreasonable. I think I can squeeze out a 2-3x speed improvement pretty easily, and then eventually work on a singular ChatMessage Bokeh model, which should make a 10x or greater improvement possible.
@philippjfr Sweet! If you have time, I'm curious what the root cause(s) are for this. For example, would using a Markdown pane and updating its string when new messages come in be a performant workaround?
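For reference, a minimal sketch of that workaround, assuming a single pn.pane.Markdown whose string is appended to as messages arrive. The append_message helper and the message contents are illustrative, not taken from this thread:

```python
import panel as pn

pn.extension()

# One Markdown pane holds the entire history instead of one ChatMessage per entry.
history = pn.pane.Markdown("", sizing_mode="stretch_width")

def append_message(user: str, text: str) -> None:
    # Concatenate onto the existing Markdown string; only a single pane updates.
    history.object = f"{history.object}\n\n**{user}:** {text}"

for i in range(100):
    append_message("User", f"Message {i}")

history.servable()
```

This trades away the per-message features of ChatMessage (reactions, avatars, editing) for a single lightweight Bokeh model.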
I wasn't quite able to reproduce your timings; locally I'm seeing something closer to 4 seconds, and I have been able to get that down to ~1.3 seconds.
Nice! Also weird; my laptop is quite zippy. Maybe this touches on a way that WSL is slower than native Linux? In any case, thanks for looking into the issue. :)
Applied some optimizations in a PR to optimize this.
I suspect that Philipp was using main, which included #6034 and optimized the ChatReactionIcons used in ChatMessage.
Were we fetching those in Python? |
Anyway, that's good news; we can claim a nearly 30x speedup over 1.3.4. :)
You were correct, @ahuang11; it was a problem with fetching the icons. Will wrap this PR up and then release.
Will close; there are perhaps further optimizations to make, but for now we've gone from ~30s -> 2.3s (1.3.5rc1) -> 1.3s (on current main).
ALL software version info
WSL Ubuntu
Python 3.11.6
Panel 1.3.4
Bokeh 3.3.2
Description of expected behavior and the observed behavior
It seems as though creating a pn.chat.ChatMessage takes ~0.3 seconds per message (!). This means if I load a history with 100 messages, the load time goes up by nearly 30 seconds, which makes for an unreasonably slow development experience.
In the example code I create the objects manually and then set the .objects on the ChatFeed, but I've verified that this also occurs if I send the messages one by one.
It's very possible I'm missing something simple here, but I've followed the available instructions as closely as I can manage.
Complete, minimal, self-contained example code that reproduces the issue
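The original snippet was not captured here; the following is a minimal sketch of the setup described above, timing the construction of 100 ChatMessage objects and then setting them as the ChatFeed's objects. The message contents are assumed:

```python
import time

import panel as pn

pn.extension()

# Build the messages manually, as described above; the contents are placeholders.
start = time.perf_counter()
messages = [pn.chat.ChatMessage(f"Message {i}", user="User") for i in range(100)]
print(f"Create 100 messages = {time.perf_counter() - start} seconds")

# Set the pre-built messages on the feed rather than sending them one by one.
feed = pn.chat.ChatFeed(objects=messages)
feed.servable()
```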
Screenshots or screencasts of the bug in action
(test_app) elean@DESKTOP-290LANC:~/perforce/mainline/test_app$ cd /home/elean/perforce/mainline/test_app ; /usr/bin/env /home/elean/miniforge3/envs/test_app/bin/python /home/elean/.vscode-server/extensions/ms-python.python-2023.22.1/pythonFiles/lib/python/debugpy/adapter/../../debugpy/launcher 42127 -- -m panel serve app.py --show --autoreload
Create 100 messages = 29.158785282983445 seconds
2023-12-16 07:56:31,763 Starting Bokeh server version 3.3.2 (running on Tornado 6.4)
2023-12-16 07:56:31,764 User authentication hooks NOT provided (default user enabled)
2023-12-16 07:56:31,766 Bokeh app running at: http://localhost:5006/app
2023-12-16 07:56:31,766 Starting Bokeh server with process id: 612354
For completeness, I just ran the same case outside of debugpy and got:
$ panel serve app.py --show --autoreload
Create 100 messages = 16.05742013201234 seconds
2023-12-16 08:10:38,523 Starting Bokeh server version 3.3.2 (running on Tornado 6.4)
2023-12-16 08:10:38,523 User authentication hooks NOT provided (default user enabled)
2023-12-16 08:10:38,524 Bokeh app running at: http://localhost:5006/app
2023-12-16 08:10:38,524 Starting Bokeh server with process id: 615103
Which is faster but still pretty unreasonable. :)