Stream rendering to reduce TTFB and CPU load #1209
Comments
One thing to mention: stream rendering usually won't add much of a CPU improvement, since the amount of work to be done is the same, but it will reduce the response time. It's a pretty good idea to provide a way to customize the SSR rendering system, but I think for now we'll stick with React's renderToString() method by default. This is something we could do after 2.0.
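For illustration (not part of the original comment), here is a minimal sketch of that trade-off on a plain Express server, assuming React 16's `renderToNodeStream`, which only shipped later: total render work is roughly the same, but bytes start flowing before the whole page is rendered, so TTFB drops.

```js
// Minimal sketch (hypothetical Express server, React >= 16).
// renderToString buffers the whole page; renderToNodeStream starts
// sending bytes as soon as the first chunks are ready, lowering TTFB.
const express = require('express')
const React = require('react')
const { renderToString, renderToNodeStream } = require('react-dom/server')
const App = require('./app') // hypothetical page component

const server = express()

// Buffered: the response starts only after the full render finishes.
server.get('/buffered', (req, res) => {
  const html = renderToString(React.createElement(App))
  res.send('<!doctype html>' + html)
})

// Streamed: same total CPU work, but the first byte leaves earlier.
server.get('/streamed', (req, res) => {
  res.write('<!doctype html>')
  renderToNodeStream(React.createElement(App)).pipe(res)
})

server.listen(3000)
```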
Wouldn't streaming noticeably reduce memory allocation and CPU usage for large pages, by being both asynchronous and partial?
Thought this was along the same lines. Has anyone tried https://github.com/FormidableLabs/rapscallion ?
Other features from the docs:
This adds support for, and an example of, using Rapscallion (or potentially other alternate React renderers) with Next. This has been discussed in vercel#1334 and vercel#1209. I'm submitting this as a proof of concept for discussion, though the changes are fairly minimal. I'm hoping with a bit of feedback we can add this for real!

The issue, as mentioned by @arunoda elsewhere, is that `document.js` uses `renderToString()` directly, which makes it difficult to inject an alternate renderer method.

**What I did here:**

1. Created a `renderToParts()` method, which in turn calls `doRender` with added configuration to allow for the replacement of `renderToString`, keeping `renderToString` as the default.
2. Jettisoned `document.js`, and in the example `server.js` used [Rapscallion templating](https://github.com/FormidableLabs/rapscallion#template) to effectively do the same thing.
3. 🎉🎉🎉!

**The main issue this raised:**

* The subcomponents of `Document` (`Head`, `NextScript`) rely on context, and thus are difficult to interface with except through `Document`. I got around this by falling back to props when context was not defined. Hacky, of course, and probably the main thing we'd need to refactor to do this "right".

**Results:**

The test I created for the example was to generate an MD5 hash for each of the first 3000 integers. Since this is deterministic, I cached it with Rapscallion; `renderToString` continued to render it as normal in the baseline. The test run was with ApacheBench, `ab -n 500 -c 10`.

Baseline:

```
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   0.4      0       4
Processing:   239 1293 246.1   1253    2099
Waiting:      238 1291 245.8   1251    2099
Total:        240 1293 246.1   1254    2100

Percentage of the requests served within a certain time (ms)
  50%   1254
  66%   1337
  75%   1425
  80%   1444
  90%   1588
  95%   1886
  98%   1981
  99%   1981
 100%   2100 (longest request)
```

With Rapscallion:

```
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   0.3      0       4
Processing:    97  204  39.6    190     318
Waiting:       97  202  38.6    189     318
Total:         98  205  39.6    191     319

Percentage of the requests served within a certain time (ms)
  50%    191
  66%    205
  75%    224
  80%    235
  90%    269
  95%    307
  98%    315
  99%    315
 100%    319 (longest request)
```

... or a roughly 6-7x improvement. Not super surprising, since caching, but for us a larger portion of each page is static; being able to cache that, while leaving other areas dynamic, is a huge win.

I played around with Rapscallion's streaming rendering as well, but I think that'll be most useful with something a little more real-world. Streaming seemed to have a small perf hit that gradually lessens the larger the CPU effort / payload becomes. If you were adding a bunch of CSS output in your header, for example, I think streaming would greatly improve load time.

So yeah, open to suggestions here. I think this is a good stab at a general interface for getting "parts" for a more atomic Next render, but the API of that could certainly be better. Looking forward to hearing what you think!

Love, @gcpantazis + @thumbtack ❤️
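To make the approach above concrete, here is a rough sketch of the kind of custom `server.js` the PR describes (this is not the actual PR code; `Page` and the cache key are hypothetical). It uses Rapscallion's `template` tag in place of `document.js` and its `cacheKey` prop for component-level caching.

```js
// Sketch of a custom server using Rapscallion instead of renderToString.
const express = require('express')
const React = require('react')
const { render, template } = require('rapscallion')
const Page = require('./components/page') // hypothetical page component

const server = express()

server.get('/', (req, res) => {
  // cacheKey opts this component into Rapscallion's per-component cache;
  // the expensive, deterministic part of the page (e.g. hashing the first
  // 3000 integers) only renders once and is served from cache afterwards.
  const app = render(React.createElement(Page, { cacheKey: 'Page:v1' }))

  // The template tag composes plain HTML with renderers, taking over the
  // role document.js plays in a stock Next setup.
  const doc = template`
    <html>
      <head><title>Rapscallion + Next sketch</title></head>
      <body>
        <div id="__next">${app}</div>
      </body>
    </html>
  `

  res.status(200)
  doc.toStream().pipe(res) // or: doc.toPromise().then(html => res.send(html))
})

server.listen(3000)
```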
Added an example of Rapscallion in #2279... can confirm that Rapscallion + Next is insane. Streamed/promise-based render is awesome, but component-level caching is a game-changer for us...
Now that React 16 has its own streaming renderer...
It's on our list of things to add already 👍
Any news about this?
Any news?
Next.js needs to expose a custom renderer. I love everything about Next.js except two things:
Any roadmap or plan for streaming rendering support? I'd really like to have this in Next.js.
This is pending on the React team implementing React Fizz / their plan for it.
@timneutkens What's the issue or PR to track here?
From Facebook's blog post, published on August 8th, 2019:
For anyone still waiting on server streaming support :)
Is there any update, or any other method to implement renderToNodeStream in Next.js?
@StarpTech I'd looked a bit into this (curious about this feature as well!) and it looks like the React team is working on something called react-flight, which will probably be the base for the streaming solution we are waiting for here :)

react-flight:
The relevant PRs that shine some light on the inner workings, as interpreted by me (not an expert in any of this 🙈):

#17398: a more recent PR that adds an API for chunks, so (if you're feeling lucky) you could try that part out yourself.

Not sure how everything will come together, but nevertheless I'm quite happy to see all this work being done :) This might be slightly off-topic, but hopefully interesting for people subscribing to this issue :)
@pepf thanks for the info!
Hm. Thank you all, interesting info. I'm just wondering: why should Next.js wait for React to support SSR for Suspense and related features, rather than just using streamAsString now?
@arunoda I think it will reduce memory consumption, which is very important for low-memory Lambda functions or Cloudflare Workers.
Hi
Yes, any update?
Still this: #1209 (comment). Streaming rendering will eventually be added once the React team publishes the new version of the server renderer; this is currently on the React experimental channel.
If I'm not mistaken, it seems it might come in React 18?
Any update on this?
@timneutkens Is there any way to fix this error? I am using Next version 11.1.0. Help me!
Any update on this?
With the launch of Next.js v12, I think this will come out of the box now: https://nextjs.org/blog/next-12#server-side-streaming

Next.js 12 introduces a bunch of new performance features, including Suspense and SSR streaming and much more, so I think this issue can be marked as resolved?
It is under an experimental flag and requires React 18, which is in alpha, so I'd say it is far from stable. I wouldn't close this issue until both are available as stable.
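For context, opting in looked roughly like this at the time; this is a sketch based on my reading of the Next.js 12 announcement, and the flag names are assumptions that changed in later releases.

```js
// next.config.js — sketch of enabling the experimental streaming SSR in
// Next.js 12 with React 18 alpha installed. Flag names are as I recall
// them from the Next.js 12 announcement and may differ per release.
module.exports = {
  experimental: {
    concurrentFeatures: true, // enables the streaming server renderer
    serverComponents: true,   // optional: React Server Components (needs concurrentFeatures)
  },
}
```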
Any news?
Any plans to add streaming to the older Pages Router? After inspecting the new App Router (its capabilities and especially its restrictions), I cannot justify the switch for the app I'm working on. But having streaming work with the Pages Router would solve a few performance issues we currently have on mobile.
The recently launched App Router supports streaming rendering by default; you can add [...]. @kostia1st we're currently not planning to add streaming to the Pages Router, as we had to add specific APIs (i.e. metadata) in order to make early flushing work. Without Server Components you can't stream in data, so you wouldn't get clear benefits from having it for Pages like you do in the App Router. I'm going to close this issue now as it has been shipped 👍
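For readers landing here later, a minimal sketch of what streaming looks like in the App Router: the shell flushes immediately, and the slow, data-dependent part streams in behind a Suspense boundary. `SlowComments` and its data fetch are hypothetical stand-ins, not code from this thread.

```js
// app/page.js — illustrative App Router streaming example.
import { Suspense } from 'react'

async function fetchComments() {
  // stand-in for a real data source
  await new Promise((resolve) => setTimeout(resolve, 2000))
  return [{ id: 1, text: 'First!' }]
}

// Async server component: rendered on the server and streamed in
// once its data resolves.
async function SlowComments() {
  const comments = await fetchComments()
  return (
    <ul>
      {comments.map((c) => (
        <li key={c.id}>{c.text}</li>
      ))}
    </ul>
  )
}

export default function Page() {
  return (
    <main>
      <h1>Streamed page</h1>
      {/* The fallback is part of the first flush; SlowComments streams in later. */}
      <Suspense fallback={<p>Loading comments…</p>}>
        <SlowComments />
      </Suspense>
    </main>
  )
}
```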
This closed issue has been automatically locked because it had no new activity for a month. If you are running into a similar issue, please create a new issue with the steps to reproduce. Thank you.
I suggest to stream-render pages > 50KB (hypothetical stream overhead) to reduce TTFB and CPU load.

- Inferno provides `streamAsString`, `streamAsStaticMarkup`, `streamQueueAsString`, and `streamQueueAsStaticMarkup` with inferno-server (master). As of 1.2.1, the stream renderer was experimental and 10-15% slower than `renderToString`; we should wait for 1.3 (currently RC3).
- React provides `renderToString` and `renderToStaticMarkup`.
- There is also `renderToStream` (SSR/renderToStream.js).

It would supersede #767. Preact does not support streaming (preactjs/preact#23 (comment)).
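A hypothetical sketch of what the proposal could look like in practice. The inferno-server method names come from the list above, but their signatures and import paths here are my assumptions (not verified against that version), and the 50 KB rule is modeled as a per-route choice since the rendered size isn't known up front. Assumes a Babel setup with babel-plugin-inferno for the JSX.

```js
// Hypothetical sketch only: stream large pages, buffer small ones.
import express from 'express'
import { renderToString, streamAsString } from 'inferno-server'
import BigPage from './pages/big-page'     // hypothetical large (>50 KB) page
import SmallPage from './pages/small-page' // hypothetical small page

const server = express()

// Small pages: buffered render; the stream overhead isn't worth paying.
server.get('/small', (req, res) => {
  res.send(renderToString(<SmallPage />))
})

// Pages known to exceed ~50 KB: stream to reduce TTFB and peak memory.
server.get('/big', (req, res) => {
  streamAsString(<BigPage />).pipe(res)
})

server.listen(3000)
```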