LEAK: ByteBuf.release() was not called before it's garbage-collected #9563
I don't see anything obviously wrong.
Potentially related: #9340
@sanjaypujare RPCs do time out. We do see sporadic timeouts and see the error message inside the catch block.
Just trying to clarify: you mean you have
@sanjaypujare |
Transparent retries are enabled by default (AFAIK). Please disable retry and see.
|
Yes, startup of any app in Java can cause sluggishness, so this is to be expected. If this resolves your problem, you can close this issue. Another thing to verify is whether the fix in #9360 is working for you or not.
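For anyone following the workaround discussed above: retries can be disabled when the channel is built. A minimal sketch, assuming a plaintext channel to a placeholder target (the target address and class name are illustrative, not from this thread):

```java
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;

public class NoRetryChannel {
    // Builds a channel with gRPC's retry mechanism turned off — the
    // workaround suggested in this thread for the ByteBuf leak.
    public static ManagedChannel build(String target) {
        return ManagedChannelBuilder.forTarget(target)
                .usePlaintext()   // plaintext only for local testing
                .disableRetry()
                .build();
    }

    public static void main(String[] args) {
        ManagedChannel channel = build("localhost:50051");
        System.out.println("channel built with retry disabled");
        channel.shutdownNow();
    }
}
```

Note that `disableRetry()` also disables transparent retries, which is what the leak appears to be tied to.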
Disabling retries is a workaround that "works". We're happy there is a workaround, but it now seems clearer that this is a bug triggered by retry. Something in my previous workaround wasn't enough, since 1.49.1 included that workaround. My "fix" was for the write direction; this issue looks to be on the read path, based on leak_record.txt.
Hi, is this issue solved? We observed the same LEAK message recently, and after turning off retry, it disappears. |
This issue is still open/unresolved. |
This prevents leaking message buffers. Fixes grpc#9563
What version of gRPC-Java are you using?
1.49.1
What is your environment?
JDK 17, Linux
What did you expect to see?
No LEAK detection messages
What did you see instead?
Here's the complete exception trace.
leak_record.txt
Steps to reproduce the bug
We have a gRPC service which calls another gRPC service.
We found the leak detection message a few minutes after the server starts up and requests are sent to the service.
I turned on leak detection to paranoid by setting -Dio.grpc.netty.shaded.io.netty.leakDetection.level=paranoid. We immediately started to see a lot of LEAK messages. We suspected it was something to do with setting a deadline.
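For reference, the full JVM invocation for that setting looks like the following. Note that the shaded Netty property keeps the `io.netty` segment after the shading prefix; the jar name here is a placeholder:

```shell
# Turn the leak detector of the Netty shaded into grpc-java up to
# paranoid; "app.jar" stands in for your own service jar.
java -Dio.grpc.netty.shaded.io.netty.leakDetection.level=paranoid -jar app.jar
```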
Here's a sample snippet
We removed the deadline and, with the paranoid setting, we don't see any error messages anymore.
Is there anything wrong with the way the deadline is handled in the above scenario or is this a bug?
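The snippet itself was not preserved in this capture, so here is a hedged reconstruction of the pattern being described, using the standard health-check stub from grpc-services as a stand-in for the reporter's actual service (the target address and timeout are placeholders):

```java
import java.util.concurrent.TimeUnit;

import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import io.grpc.StatusRuntimeException;
import io.grpc.health.v1.HealthCheckRequest;
import io.grpc.health.v1.HealthGrpc;

public class DeadlineCallSketch {
    public static void main(String[] args) {
        ManagedChannel channel = ManagedChannelBuilder
                .forTarget("localhost:50051")   // placeholder target
                .usePlaintext()
                .build();
        try {
            // Per-call deadline: the pattern the reporter associates
            // with the LEAK messages when retries are enabled.
            HealthGrpc.HealthBlockingStub stub = HealthGrpc.newBlockingStub(channel)
                    .withDeadlineAfter(500, TimeUnit.MILLISECONDS);
            stub.check(HealthCheckRequest.getDefaultInstance());
        } catch (StatusRuntimeException e) {
            // DEADLINE_EXCEEDED (or UNAVAILABLE if no server) lands here,
            // matching the catch block mentioned earlier in the thread
            System.err.println("RPC failed: " + e.getStatus());
        } finally {
            channel.shutdownNow();
        }
    }
}
```

`withDeadlineAfter` returns a new stub instance, so the deadline applies per call rather than per channel; that per-call use is what this issue's reporters correlate with the leak.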