
SDK stops working throwing System.TimeoutException for all requests after random time #710

Closed
ChristianGalla opened this issue Nov 17, 2020 · 5 comments

ChristianGalla commented Nov 17, 2020

Description of the Issue

We are using the SDK in a long-running console application to upload many files to Box. Many copies of the application run at the same time.

After a random amount of time, it is no longer possible to download files: every call to BoxClient.FilesManager.DownloadAsync throws System.TimeoutException.

Other application instances are not directly affected, but they may fail at a different time.

Steps to Reproduce

Randomly, after a few hours, days, or even weeks, the API stops working and every call to BoxClient.FilesManager.DownloadAsync throws System.TimeoutException.

This happens even for small CSV files. The same files can still be accessed via the Box website, even between two failing calls of this method.

I have tried, without success, retrying calls to this method after a delay of a few seconds. Even manually recreating the Box session did not help; reconnecting to Box does not throw an error.

The only workaround I have found is to restart the application.
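
For reference, a minimal sketch of the download loop (illustrative only; the BoxClient setup, the file IDs, and the output path are placeholders for our real configuration):

using System.IO;
using System.Threading.Tasks;
using Box.V2;

// Minimal repro sketch. "client" is an already-authenticated BoxClient;
// "fileIds" and "outputDirectory" stand in for our real configuration.
static async Task DownloadAllAsync(BoxClient client, string[] fileIds, string outputDirectory)
{
    foreach (var fileId in fileIds)
    {
        // After some random time this call starts throwing System.TimeoutException
        // for every file until the process is restarted.
        using (var contents = await client.FilesManager.DownloadAsync(fileId))
        using (var target = File.Create(Path.Combine(outputDirectory, fileId + ".csv")))
        {
            await contents.CopyToAsync(target);
        }
    }
}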

Expected Behavior

The SDK keeps working.

Error Message, Including Stack Trace

System.TimeoutException: Request timed out ---> System.Threading.Tasks.TaskCanceledException: A task was canceled.
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Box.V2.Request.HttpRequestHandler.<getResponse>d__7.MoveNext()
   --- End of inner exception stack trace ---
   at Box.V2.Request.HttpRequestHandler.<getResponse>d__7.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Box.V2.Request.HttpRequestHandler.<ExecuteAsync>d__5`1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Box.V2.Services.BoxService.<ToResponseAsync>d__5`1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Box.V2.Managers.BoxResourceManager.<ExecuteRequest>d__12`1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Box.V2.Managers.BoxResourceManager.<ToResponseAsync>d__11`1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Box.V2.Managers.BoxFilesManager.<DownloadAsync>d__3.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)

Possible cause of this issue

I had a look at the implementation of the SDK, and this looks like connection pool starvation of the HttpClient.

My guess is that a failed HttpResponseMessage is not disposed in HttpRequestHandler.ExecuteAsync before the request is retried (not tested).
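
To illustrate the suspected pattern (a simplified sketch, not the SDK's actual code): if a response that is going to be retried is never disposed, its underlying connection is not returned to the pool, and once the per-host connection limit is reached every subsequent request waits until it hits the client timeout.

using System;
using System.Net.Http;
using System.Threading.Tasks;

// Simplified sketch of the suspected retry path (hypothetical, not the SDK's code).
static async Task<HttpResponseMessage> SendWithRetryAsync(
    HttpClient httpClient, Func<HttpRequestMessage> createRequest, int maxAttempts = 3)
{
    for (var attempt = 1; ; attempt++)
    {
        var response = await httpClient.SendAsync(createRequest());

        if (response.IsSuccessStatusCode || attempt == maxAttempts)
        {
            return response;
        }

        // If this Dispose is missing, the failed response keeps its pooled
        // connection occupied; after enough failed attempts the pool is
        // exhausted and every new request times out.
        response.Dispose();

        await Task.Delay(TimeSpan.FromSeconds(attempt));
    }
}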

Versions Used

.Net SDK: 3.24.0
Windows: Server 2012 R2

ChristianGalla (Author) commented

We have increased the connection pool size in the app.config of our application, and it looks like this prevents the issue. However, I think this should only be a temporary workaround, because it may have other side effects such as increased memory usage.

<system.net>
  <connectionManagement>
    <add address="*" maxconnection="500" />
  </connectionManagement>
</system.net>

sujaygarlanka (Contributor) commented

@ChristianGalla Thanks for reporting the bug. We will take a look at it and respond ASAP.

keithgagne commented

I am facing the same issue. I implemented the suggested workaround, but the problem continues. Has anyone found additional details or an alternative workaround?

fullstackstorm commented

I'm using .NET Core 3.1 and have the same problem. I get a TimeoutException with the following three inner exceptions: IOException, TaskCanceledException, and SocketException.

How do I set maxconnection in .NET Core? Is there a programmatic way to do it?

mwwoda (Contributor) commented May 4, 2022

Hi all,
Sorry for such a late reply. A fix for this bug has been merged into the main branch and will be released with the next package. It was indeed caused by connection pool starvation due to a missing dispose on the response. Thanks to @ChristianGalla for pointing this out!

If you want to further increase the number of allowed connections, you can do the following:
.NET Framework - change the value of ServicePointManager.DefaultConnectionLimit, e.g. ServicePointManager.DefaultConnectionLimit = 20 (see the sketch below).
.NET Core - the connection pool should scale automatically as needed.
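
For example, on .NET Framework the limit can be raised programmatically at application startup (a minimal sketch of the setting mentioned above):

using System.Net;

// .NET Framework only: raise the default per-host connection limit before
// the first request is sent.
ServicePointManager.DefaultConnectionLimit = 20;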

Thanks,
Mateusz

mwwoda closed this as completed May 4, 2022