
[BUG] netty memory leaks in TransportResponseHandler #253

Closed
ghost opened this issue Jul 12, 2022 · 0 comments
Labels
bug Something isn't working

ghost commented Jul 12, 2022

What is the bug?

In early testing (May 2022), I found that Netty reports direct-buffer memory leaks in some cases.

How to reproduce the bug?

Could you share logs or screenshots?

```
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86763 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86771 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86790 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86793 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86808 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86818 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86826 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86836 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86844 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86859 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86867 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86876 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86886 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86892 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86904 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86929 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86938 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86948 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86949 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86967 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86971 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86983 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 86991 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 87008 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 WARN TransportResponseHandler: Ignoring response for RPC 87017 from /10.70.0.72:41015 (0 bytes) since it is not outstanding
22/05/05 14:42:39 ERROR ResourceLeakDetector: LEAK: ByteBuf.release() was not called before it's garbage-collected. See https://netty.io/wiki/reference-counted-objects.html for more information.
Recent access records: 
Created at:
        com.aliyun.emr.io.netty.buffer.PooledByteBufAllocator.newDirectBuffer(PooledByteBufAllocator.java:401)
        com.aliyun.emr.io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:188)
        com.aliyun.emr.io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:179)
        com.aliyun.emr.io.netty.buffer.AbstractByteBufAllocator.ioBuffer(AbstractByteBufAllocator.java:140)
        com.aliyun.emr.io.netty.channel.DefaultMaxMessagesRecvByteBufAllocator$MaxMessageHandle.allocate(DefaultMaxMessagesRecvByteBufAllocator.java:120)
        com.aliyun.emr.io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:150)
        com.aliyun.emr.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
        com.aliyun.emr.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
        com.aliyun.emr.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
        com.aliyun.emr.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
        com.aliyun.emr.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
        com.aliyun.emr.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        com.aliyun.emr.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        java.lang.Thread.run(Thread.java:750)
22/05/05 14:42:40 INFO Executor: Finished task 0.0 in stage 13.0 (TID 58). 5377 bytes result sent to driver
22/05/05 14:42:40 INFO YarnCoarseGrainedExecutorBackend: Got assigned task 76
22/05/05 14:42:40 INFO Executor: Running task 18.0 in stage 13.0 (TID 76)
22/05/05 14:42:40 INFO RssInputStream: Moved to next partition PartitionLocation[118-0 10.70.0.68:42876:46843:46214:42803 Mode: Master peer: ],startMapIndex 0 endMapIndex 2147483647 , 0/1 read , get chunks size 1
22/05/05 14:42:40 INFO RssInputStream: Moved to next partition PartitionLocation[119-0 10.70.0.72:32821:41015:33491:46135 Mode: Master peer: ],startMapIndex 0 endMapIndex 2147483647 , 0/1 read , get chunks size 1
22/05/05 14:42:40 INFO RssInputStream: Moved to next partition PartitionLocation[120-0 10.70.0.71:39065:33606:41873:39543 Mode: Master peer: ],startMapIndex 0 endMapIndex 2147483647 , 0/1 read , get chunks size 1
22/05/05 14:42:40 INFO RssInputStream: Moved to next partition PartitionLocation[121-0 10.70.0.68:42876:46843:46214:42803 Mode: Master peer: ],startMapIndex 0 endMapIndex 2147483647 , 0/1 read , get chunks size 1
22/05/05 14:42:40 INFO TransportClientFactory: Successfully created connection to /10.70.0.72:33491 after 0 ms (0 ms spent in bootstraps)
```
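The warnings and the ResourceLeakDetector report together hint at a common Netty pattern: when a response arrives for an RPC that is no longer outstanding, the handler logs a warning and drops the message, but if the response body is a reference-counted ByteBuf it must still be released. The sketch below is not the actual TransportResponseHandler code — it is a hypothetical illustration of that pattern, using a simplified stand-in class instead of Netty's ByteBuf so it runs standalone; the names `handleLeaky`/`handleFixed` are mine.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

// Simplified stand-in for Netty's reference-counted ByteBuf.
class RefCountedBuf {
    private final AtomicInteger refCnt = new AtomicInteger(1);
    void release() { refCnt.decrementAndGet(); }
    int refCnt() { return refCnt.get(); }
}

public class LeakSketch {
    // Hypothetical outstanding-RPC table, keyed by RPC id.
    static final Map<Long, Runnable> outstandingRpcs = new ConcurrentHashMap<>();

    // Leaky variant: ignoring the response drops the buffer without release(),
    // which is what a "LEAK: ByteBuf.release() was not called" report suggests.
    static void handleLeaky(long rpcId, RefCountedBuf body) {
        Runnable callback = outstandingRpcs.remove(rpcId);
        if (callback == null) {
            // "Ignoring response for RPC ... since it is not outstanding"
            return;                // BUG: body.release() is never called
        }
        try { callback.run(); } finally { body.release(); }
    }

    // Fixed variant: the buffer is released on every path, ignored or not.
    static void handleFixed(long rpcId, RefCountedBuf body) {
        Runnable callback = outstandingRpcs.remove(rpcId);
        try {
            if (callback != null) callback.run();
        } finally {
            body.release();        // buffer returned to the pool either way
        }
    }

    public static void main(String[] args) {
        RefCountedBuf leaked = new RefCountedBuf();
        handleLeaky(86763L, leaked);           // RPC 86763 is not outstanding
        System.out.println("leaky refCnt = " + leaked.refCnt());

        RefCountedBuf freed = new RefCountedBuf();
        handleFixed(86763L, freed);
        System.out.println("fixed refCnt = " + freed.refCnt());
    }
}
```

Running it prints a remaining refCnt of 1 for the leaky path (the buffer is never returned to the pool) and 0 for the fixed path, which mirrors how such buffers would only surface later via the leak detector once they are garbage-collected.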

/cc @who-need-to-know

/assign @who-can-solve-this-bug

This issue was closed.