
large_unary interop test timing out against most grpc clients #149

Closed
jtattermusch opened this issue Mar 7, 2019 · 10 comments · Fixed by #154
Labels
bug Something isn't working

Comments

@jtattermusch added the bug label on Mar 7, 2019
@jtattermusch (Contributor, Author) commented on Mar 8, 2019

It looks like ReadSingleMessageAsync gets stuck forever when reading a large unary request.

@jtattermusch (Contributor, Author) commented:
CC @JunTaoLuo

@jtattermusch (Contributor, Author) commented on Mar 8, 2019

What I'm seeing is that ReadAsync() reads 98304 bytes in the first invocation and never returns from the second invocation.

The message sent by the interop client is slightly bigger than 271828 bytes.
https://github.com/grpc/grpc/blob/master/doc/interop-test-descriptions.md#large_unary

@jtattermusch (Contributor, Author) commented:
I bisected the breakage down to
851e231 (which is basically just the upgrade to preview 3).

So there are two options:

  • either the expected usage pattern for HttpContext's PipeReader changed in preview 3
  • or there's a bug in preview 3 that breaks our code
851e23101dd472703fd7e6caa91aeffa13f041ec is the first bad commit
commit 851e23101dd472703fd7e6caa91aeffa13f041ec
Author: Jan Tattermusch <[email protected]>
Date:   Thu Mar 7 09:36:18 2019 +0100

    Use .NET Core 3 preview 3 (#146)
    
    * bump to .NET Core 3 preview 3
    
    * use Grpc 1.19.0 nugets
    
    * nits in test runner scripts

:040000 040000 1ceea3b717fa44ccdf2d95fda7a259dfcf40d0dd ee9ed0e3caf0f016f2a218fdc9ed1d3c2c2f8e9b M	build
:100755 100755 8ae1db70c98764cbd9d080abb72b127c194bcc8e dcb1403eed4f7e3401925becf1942436686c2a73 M	build_and_test.sh
:100644 100644 f660a138ee650ca393d9a591c37cc168de911ddc 433a84f36505e22f8a8c2468b9f1524ac7cf906c M	global.json
:040000 040000 f62ee5d71e071600534d87781347de92af56965f b84a25afb89d9887afa2b5397fd289d57a712e4c M	kokoro

@davidfowl (Contributor) commented:
cc @jkotalik, likely something to do with your changes.

@jkotalik commented on Mar 8, 2019

dotnet/aspnetcore#8200 may fix it. Other than that, I'm not sure and would need a simple repro.

@jkotalik commented on Mar 8, 2019

Seems like just reading ~100 KB with ReadAsync() and then calling ReadAsync() again should cause it to repro?
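[Editor's note: a minimal repro in that spirit might look like the sketch below. The endpoint shape and the `BodyReader` property name are assumptions — the request `PipeReader` property was renamed across the 3.0 previews — so treat this as illustrative only.]

```csharp
// Sketch of a repro: drain an HTTP/2 request body through the PipeReader,
// the same pattern the gRPC server uses. With a body larger than the default
// 96 KB stream window, the second ReadAsync() would hang if the server never
// sends a WINDOW_UPDATE for the consumed bytes.
app.Run(async context =>
{
    var reader = context.Request.BodyReader; // name assumed; varied across previews
    long total = 0;
    while (true)
    {
        var result = await reader.ReadAsync();   // hangs here on the second call
        total += result.Buffer.Length;
        reader.AdvanceTo(result.Buffer.End);     // mark everything as consumed
        if (result.IsCompleted) break;
    }
    await context.Response.WriteAsync($"read {total} bytes");
});
```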

@JamesNK (Member) commented on Mar 8, 2019

Ping us once dotnet/aspnetcore#8200 is merged and we'll update to use a nightly SDK build. We'll test whether it still affects us.

@JunTaoLuo (Contributor) commented on Mar 8, 2019

I think this is actually the HTTP/2 stream flow-control window limit being hit: https://docs.microsoft.com/en-us/dotnet/api/microsoft.aspnetcore.server.kestrel.core.http2limits.initialstreamwindowsize?view=aspnetcore-2.2#Microsoft_AspNetCore_Server_Kestrel_Core_Http2Limits_InitialStreamWindowSize

I was able to create a repro locally with a 100 KB message and "fix" the problem by setting InitialStreamWindowSize to int.MaxValue. @jtattermusch can you try that and see if the interop tests still fail? Note that you'll probably also need to set InitialConnectionWindowSize if you are going above 128 KB.

@JunTaoLuo (Contributor) commented:
To clarify: dotnet/aspnetcore#8200 should fix the behaviour in the long term, but for now we can set InitialStreamWindowSize and InitialConnectionWindowSize as a workaround.
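[Editor's note: for reference, the workaround might be wired up as follows. This is a sketch only — `Startup` and the host-builder shape are assumptions, and the exact builder API shifted between the 3.0 previews.]

```csharp
public class Program
{
    public static void Main(string[] args) =>
        WebHost.CreateDefaultBuilder(args)
            .ConfigureKestrel(options =>
            {
                // Raise the HTTP/2 flow-control windows so a ~272 KB unary
                // message is not throttled by the default per-stream window
                // (96 KB) or the connection-wide window (128 KB).
                options.Limits.Http2.InitialStreamWindowSize = int.MaxValue;
                options.Limits.Http2.InitialConnectionWindowSize = int.MaxValue;
            })
            .UseStartup<Startup>() // hypothetical Startup class
            .Build()
            .Run();
}
```

With the aspnetcore#8200 fix in place, Kestrel sends WINDOW_UPDATE frames as the body is consumed, so these limits no longer need to be raised.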

5 participants