
Got too many pings from the client, closing the connection in TiDB server log files #5688

Closed
jackysp opened this issue Aug 24, 2022 · 6 comments · Fixed by #5528
Labels
component/compute, type/bug (The issue is confirmed as a bug.)

Comments

@jackysp
Member

jackysp commented Aug 24, 2022

Bug Report

Please answer these questions before submitting your issue. Thanks!

See pingcap/tidb#36861 for more details.

1. Minimal reproduce step (Required)

2. What did you expect to see? (Required)

3. What did you see instead (Required)

4. What is your TiFlash version? (Required)

jackysp added the type/bug (The issue is confirmed as a bug.) label on Aug 24, 2022
@windtalker
Contributor

I'm not sure whether the "Got too many pings from the client" error comes from the connection between TiDB and TiFlash, because according to the related gRPC keepalive documentation:

  • Why is my client not sending keepalive pings even after configuring GRPC_ARG_KEEPALIVE_TIME_MS and GRPC_ARG_KEEPALIVE_TIMEOUT_MS?
    • This can happen in the following cases -
      • There are no RPCs in flight and GRPC_ARG_KEEPALIVE_PERMIT_WITHOUT_CALLS has not been set to 1(defaults to 0). If we require the endpoint to be able to send pings even when there are no ongoing RPCs, GRPC_ARG_KEEPALIVE_PERMIT_WITHOUT_CALLS should be set to 1 as documented above.
      • When there isn't any data/header being sent on the transport, gRPC clients restrict the number of pings to 2 by default. Setting GRPC_ARG_HTTP2_MAX_PINGS_WITHOUT_DATA to 0 will remove this limit.

According to the explanation above, too many pings usually happens when the client sets GRPC_ARG_KEEPALIVE_PERMIT_WITHOUT_CALLS to true while the server sets it to false, but TiFlash already sets GRPC_ARG_KEEPALIVE_PERMIT_WITHOUT_CALLS to true.
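For reference, here is a minimal sketch of how these keepalive knobs map onto the gRPC C++ channel-argument API on both sides of a connection; the address, port, and timing values are hypothetical, and this is not TiDB's or TiFlash's actual configuration:

```cpp
#include <grpcpp/grpcpp.h>

int main() {
  // Client side: allow keepalive pings even when no RPC is in flight,
  // and lift the default cap of 2 pings without data on the wire.
  grpc::ChannelArguments args;
  args.SetInt(GRPC_ARG_KEEPALIVE_TIME_MS, 10 * 1000);      // send a ping every 10s
  args.SetInt(GRPC_ARG_KEEPALIVE_TIMEOUT_MS, 3 * 1000);    // wait 3s for the ping ack
  args.SetInt(GRPC_ARG_KEEPALIVE_PERMIT_WITHOUT_CALLS, 1); // ping even with no active RPCs
  args.SetInt(GRPC_ARG_HTTP2_MAX_PINGS_WITHOUT_DATA, 0);   // no cap on pings without data
  auto channel = grpc::CreateCustomChannel(
      "tiflash-host:3930", grpc::InsecureChannelCredentials(), args); // hypothetical address
  (void)channel;

  // Server side: if the server does not also permit calls-free pings (with a
  // compatible minimum ping interval), each unexpected ping counts as a
  // "strike"; once the strike limit is exceeded the server sends GOAWAY and
  // logs the "too many pings" error.
  grpc::ServerBuilder builder;
  builder.AddChannelArgument(GRPC_ARG_KEEPALIVE_PERMIT_WITHOUT_CALLS, 1);
  builder.AddChannelArgument(GRPC_ARG_HTTP2_MIN_RECV_PING_INTERVAL_WITHOUT_DATA_MS, 5 * 1000);
  builder.AddChannelArgument(GRPC_ARG_HTTP2_MAX_PING_STRIKES, 2); // 2 is the default
  return 0;
}
```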

@jackysp
Member Author

jackysp commented Aug 26, 2022

I think TiDB sets it to false.

@windtalker
Contributor

At least in the current master codebase, TiDB still sets it to true:
https://github.com/tikv/client-go/blob/master/internal/client/client.go#L194-L197

And since TiDB is just the client side, even if it were set to false, it should not cause the too many pings error.

@jackysp
Member Author

jackysp commented Aug 26, 2022

Yes, TiDB is the client side in client-go. What I mean is: is it possible that TiFlash is the client side, trying to get some info, maybe table schemas, from the TiDB server?

@windtalker
Contributor

> Yes, TiDB is the client side in client-go. What I mean is: is it possible that TiFlash is the client side, trying to get some info, maybe table schemas, from the TiDB server?

AFAIK, TiFlash does not communicate with TiDB as a gRPC client; for table schemas, TiFlash gets them from TiKV. Even if TiFlash did communicate with TiDB as a gRPC client, TiFlash uses client-c as its client, and in client-c GRPC_ARG_KEEPALIVE_PERMIT_WITHOUT_CALLS is not set explicitly; I think the default value is false.
https://github.com/tikv/client-c/blob/master/include/pingcap/kv/internal/conn.h#L21-L24
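If a TiFlash-side gRPC client ever did need calls-free keepalive pings, a minimal sketch of making the setting explicit on a C++ channel could look like the following; this is an assumption for illustration, not client-c's actual code, and the helper name and timing values are hypothetical:

```cpp
#include <grpcpp/grpcpp.h>

// Hypothetical helper: set the keepalive behaviour explicitly instead of
// relying on the C-core default for GRPC_ARG_KEEPALIVE_PERMIT_WITHOUT_CALLS,
// which is 0 (no pings while there are no active calls).
grpc::ChannelArguments buildKeepaliveArgs() {
  grpc::ChannelArguments args;
  args.SetInt(GRPC_ARG_KEEPALIVE_TIME_MS, 30 * 1000);
  args.SetInt(GRPC_ARG_KEEPALIVE_TIMEOUT_MS, 10 * 1000);
  args.SetInt(GRPC_ARG_KEEPALIVE_PERMIT_WITHOUT_CALLS, 1);
  return args;
}
```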

@jackysp
Member Author

jackysp commented Aug 29, 2022

Do you mean TiFlash uses TiKV's client-c to communicate with TiDB?!
Well, I think I shouldn't guess about modules I'm not too familiar with. I'm going to keep the issue in the tidb repo and close this one.

jackysp closed this as completed on Aug 29, 2022