GC regression in v12.5.0 #28722
Here's an example that demonstrates the issue:

```js
const bigData = []
const startDate = new Date()

// Allocate Sets forever; every 1,000 Sets, print count / heap used / elapsed time.
while (true) {
  const someData = new Set()
  for (let i = 0; i < 128; i++) {
    someData.add(i)
  }
  bigData.push(someData)

  if (bigData.length % 1000 === 0) {
    process.stdout.write([
      `\r${bigData.length.toLocaleString()}`,
      `@`,
      `${Math.floor(process.memoryUsage().heapUsed / (1024 * 1024)).toLocaleString()}MB`,
      `+`,
      `${((new Date()) - startDate).toLocaleString()}ms`
    ].join(' '))
  }
}
```

Node v11.15.0: note how fast it is. I killed it around 6.6m / 16.8GB - you'll have to trust me that this runs all the way to the 40GB limit and dies.
Node v12.6.0: note the duration (the speed of iteration remains consistent throughout)
Oh, here are the tests without any memory flags. v12.6.0 (it stutters significantly as it runs out of memory, crawls along for a few more seconds, and explodes):
v11.15.0 (slams into the limit pretty hard and explodes):
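For context, a sketch of how these runs can be invoked - the exact commands aren't shown here, the 40GB value is an assumption based on the limit mentioned above, and `repro.js` stands in for the snippet at the top:

```sh
# With an enlarged old-space limit (assumed value: 40 GB ≈ 40960 MB)
node --max-old-space-size=40960 repro.js

# Without any memory flags, i.e. the default heap limit
node repro.js
```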
...sorry the timings differ a bit. The regression was introduced in v12.5.0:
- v12.1.0
- v12.2.0
- v12.3.0
- v12.4.0
- v12.5.0 (this is where the performance degraded)
- v12.6.0
https://github.com/nodejs/node/releases/tag/v12.5.0
... @psmarshall I don't suppose you're still involved with V8 Garbage Collection?
Hmmm... I'm wondering if this is the same issue I'm seeing... using valgrind. See: https://gist.github.com/jasnell/001808188c36a9936bdbddb69817e84f
@joyeecheung @targos ... there's definitely something wrong happening. Valgrind is showing a massive number of issues on latest master. I haven't had a chance to bisect the issues, but look at the gist above to see where the failures are happening.
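For anyone who wants to reproduce the valgrind run locally, the invocation is roughly along these lines (a sketch - the exact flags behind the gist aren't listed, and `test.js` stands in for whatever script exercises the allocations):

```sh
# Run a from-source build of node under valgrind.
# --track-origins=yes reports where each uninitialised value was created.
valgrind --track-origins=yes --num-callers=30 ./node test.js
```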
ping @nodejs/v8 @nodejs/v8-update
Could be related to #28558 @hashseed. The warnings in https://gist.github.com/jasnell/001808188c36a9936bdbddb69817e84f seem to be mostly about using uninitialized memory within …
Potentially related, yes, but just to be certain, does it also happen when building with the standalone d8 shell?
No, it doesn't happen with d8. Looking it over, it appears one of the allocators may not be initialized appropriately. Still investigating tho.
Working on bisecting the commits. Build... Test... Bisect... Build... Test... Bisect... going to take a while.
@jasnell, if it helps, I was able to narrow down the range of commits to this: 9bead0c...779a243
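For anyone following along, the bisect over that range looks roughly like this (a sketch; the test step is whatever reproduces the regression or the valgrind warnings for you, and `repro.js` is a placeholder):

```sh
git bisect start
git bisect bad 779a243      # known-bad end of the range
git bisect good 9bead0c     # known-good end of the range

# At each revision git bisect checks out:
./configure && make -j"$(nproc)"
./node repro.js             # or run it under valgrind
git bisect good             # or: git bisect bad
```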
Both of those would make sense. All of the failures appear to originate in the random number generation for EntropySource. Everything else seems to snowball out from there.
/cc @sam-github
Is there still a chance that this is related to the original issue or should we open a new one?
It's really difficult to say. I'm going to try reverting the three commits from that PR and see where we end up.
I can definitely confirm that reverting that PR resolves the flood of valgrind issues I'm seeing. I'll open a PR that folks can experiment with to see if it resolves the original issue discussed here.
PR here: #28739
I've pulled down @jasnell's branch, built it and tested it - bad news I'm afraid:
The performance issue is still present, and it still runs out of memory.
We recently switched from 12.4.0 to 12.7.0 and have seen changes in our GC profile. Our 95th-percentile GC pause went from 8ms in 12.4.0 to 15ms in 12.7.0 under a similar workload. @theninj4 were you able to track down anything related to the V8 upgrade that may point to the root cause?
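For comparing GC pauses across versions, here's a minimal sketch using the built-in perf_hooks 'gc' performance entries (the percentile bookkeeping is illustrative, not how any particular production setup measures it):

```js
const { PerformanceObserver } = require('perf_hooks')

const pauses = []

// Every 'gc' performance entry's duration is one GC pause, in milliseconds.
const obs = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    pauses.push(entry.duration)
  }
})
obs.observe({ entryTypes: ['gc'] })

// Log a rough 95th-percentile pause every 10 seconds.
setInterval(() => {
  if (pauses.length === 0) return
  const sorted = [...pauses].sort((a, b) => a - b)
  const p95 = sorted[Math.floor(sorted.length * 0.95)]
  console.log(`p95 GC pause: ${p95.toFixed(2)}ms over ${sorted.length} collections`)
}, 10000).unref()
```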
I haven't, no. The behaviour I'm seeing may well be more of a feature than a regression - I very much doubt my use case of rapidly loading tens of GB of data onto the heap is one that the V8 team are interested in. I've stopped loading my huge data set onto the heap and moved it to Redis, where I found 12.5.0 is more performant than 12.4.0 at marshalling the data in and out of Redis.
The latest changes in v12.9.0, specifically around JSON parse performance, have been amazing. As such, it's now bizarrely more performant to shuffle all my data in and out of Redis than it is to try and wrestle with the heap and garbage collection. I'll close this issue 👍
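For what it's worth, the Redis round trip boils down to serialising each Set to JSON and back - a minimal sketch (the actual Redis get/set calls are omitted, since they depend on the client you use):

```js
// Sets aren't directly JSON-serialisable, so spread them into arrays first.
const setToJson = (set) => JSON.stringify([...set])
const jsonToSet = (json) => new Set(JSON.parse(json))

// Round trip; in practice the string is what gets written to / read from Redis.
const original = new Set([1, 2, 3])
const restored = jsonToSet(setToJson(original))
console.log(restored.has(2)) // true
```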
I'm working on some "big data" type stuff and figured I'd have a crack at it with Node.js. I'm effectively streaming a huge data set from disk into memory - millions of Set()s holding hundreds of Numbers. I'm starting Node.js as follows (that's not a typo, 142GB):
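A minimal sketch of that invocation, assuming the 142GB figure is the --max-old-space-size value (`load-data.js` is a placeholder for the loading script):

```sh
# 142 GB expressed in MB for V8's old-space limit (assumed flag and value)
node --max-old-space-size=145408 load-data.js
```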
Now, I've been using v11.15.0 for a while, and this all works remarkably well, all things considered. I reinstalled my server last week and ended up with the latest v12 release - I'd heard there'd been some changes to how the memory side of things works and figured I'd give it a whirl. I'm using the same flags as v11, the identical command as above.
First up, v12 runs around 4 times slower than v11 when loading my data set into memory. Secondly, when I pass ~13GB of heap usage (as reported by process.memoryUsage().heapUsed) the process prints the following message and dies:
Let me know if I can help in any way.