
Multiple files upload hangs the process in a blazor server app #47301

Open
1 task done
sabrite opened this issue Mar 19, 2023 · 10 comments
Labels
area-blazor Includes: Blazor, Razor Components bug This issue describes a behavior which is not expected - a bug. Priority:3 Work that is nice to have

Comments

@sabrite

sabrite commented Mar 19, 2023

Is there an existing issue for this?

  • I have searched the existing issues

Describe the bug

I have a Blazor Server app that calls an API to upload multiple images selected using the InputFile control. It always works:

  1. When a single file is uploaded, no matter what the size is (I am trying images of up to 10MB at the moment)
  2. When two or more images are selected and the total size of all images is less than 1MB

And it doesn't work:

  1. When two or more images are selected, with each image 2MB or larger. The request made through HttpClient just dies (it never reaches the API controller).

Here is the code for anyone who wants to quickly reproduce the issue:

Index.Razor

@page "/"
@using System.Net.Http.Headers
@inject HttpClient _httpClient

<PageTitle>Index</PageTitle>

<InputFile class="custom-file-input" OnChange="@LoadFiles" multiple="multiple" />
<button class="btn btn-primary" type="button" @onclick="UploadFiles">Upload</button>

@code{
    private IReadOnlyList<IBrowserFile>? _images;

    private async Task LoadFiles(InputFileChangeEventArgs e)
    {
        _images = e.GetMultipleFiles();
    }

    private async Task UploadFiles()
    {
        const long maxFileSize = 1024 * 1024 * 300;
        _httpClient.Timeout = TimeSpan.FromMinutes(10);
        var uri = "https://localhost:7112/Files";
        using var content = new MultipartFormDataContent();

        foreach (var file in _images)
        {
            var fileContent = new StreamContent(file.OpenReadStream(maxFileSize));
            fileContent.Headers.ContentType = new MediaTypeHeaderValue(file.ContentType);
            content.Add(fileContent, "files", file.Name);
        }

        var response = await _httpClient.PostAsync(uri, content);

    }

}

and here is the controller:

[HttpPost]
public void Upload([FromForm] IEnumerable<IFormFile> files)
{
    var count = files.Count();
    // Do something with files
}

In the API project, I have tried doing this:

builder.WebHost.UseKestrel(options =>
{
    options.Limits.MaxRequestBodySize = null;
});

and in the Blazor Server project, tried this:

builder.Services.AddServerSideBlazor(options =>
{
    options.DetailedErrors = true;
    options.DisconnectedCircuitMaxRetained = 100;
    options.DisconnectedCircuitRetentionPeriod = TimeSpan.FromMinutes(3);
    options.JSInteropDefaultCallTimeout = TimeSpan.FromMinutes(10);
    options.MaxBufferedUnacknowledgedRenderBatches = 10;
}).AddHubOptions(options =>
{
    options.ClientTimeoutInterval = TimeSpan.FromMinutes(10);
    options.EnableDetailedErrors = true;
    options.HandshakeTimeout = TimeSpan.FromSeconds(15);
    options.KeepAliveInterval = TimeSpan.FromSeconds(15);
    options.MaximumParallelInvocationsPerClient = 1;
    options.MaximumReceiveMessageSize = 32 * 1024 * 1024;
    options.StreamBufferCapacity = 33554432;
});

Nothing seems to work. I am not sure whether this is a bug or I am missing something very obvious. I appreciate any help. Thanks.

Expected Behavior

All files should be successfully posted to backend API, no matter what the size of each file is.

Steps To Reproduce

No response

Exceptions (if any)

No response

.NET Version

.NET 7

Anything else?

No response

@dotnet-issue-labeler dotnet-issue-labeler bot added the area-blazor Includes: Blazor, Razor Components label Mar 19, 2023
@MackinnonBuck
Member

MackinnonBuck commented Mar 20, 2023

Thanks for reaching out, @sabrite. Could you please enable server-side logging (preferably trace-level; see ASP.NET Core Logging) and provide the logs from when the issue occurs, to help us determine what's going on here?

@MackinnonBuck MackinnonBuck added the Needs: Author Feedback The author of this issue needs to respond in order for us to continue investigating this issue. label Mar 20, 2023
@ghost

ghost commented Mar 20, 2023

Hi @sabrite. We have added the "Needs: Author Feedback" label to this issue, which indicates that we have an open question for you before we can take further action. This issue will be closed automatically in 7 days if we do not hear back from you by then - please feel free to re-open it if you come back to this issue after that time.

@sabrite
Author

sabrite commented Mar 20, 2023

Thanks @MackinnonBuck - there is nothing in the logs whatsoever. The only thing recorded by the Blazor Server app is the HTTP request it sends to the API. I can't see anything with tracing enabled, even using IntelliTrace.

I have created a repo with the bare minimum code. You should be able to reproduce this by uploading 3 or more files, each 2MB or larger. Here is the repo: https://github.com/sabrite/BlazorFileUpload

I look forward to your input. Thanks.

@ghost ghost added Needs: Attention 👋 This issue needs the attention of a contributor, typically because the OP has provided an update. and removed Needs: Author Feedback The author of this issue needs to respond in order for us to continue investigating this issue. labels Mar 20, 2023
@mkArtakMSFT mkArtakMSFT added investigate and removed Needs: Attention 👋 This issue needs the attention of a contributor, typically because the OP has provided an update. labels Mar 21, 2023
@sabrite
Author

sabrite commented Mar 22, 2023

@SteveSandersonMS @danroth27 can someone please look into this?

@SteveSandersonMS
Member

I've tracked this down to a defect in the backpressure mechanism with JS streams.

Workaround

One workaround is simply to read the whole file into a byte[] on the .NET side before constructing the HTTP request content. The key thing is that each time you call OpenReadStream, you have to read that entire file before you call OpenReadStream on the next file. Reading multiple streams in parallel within a single user circuit is what causes the problem.
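As a minimal sketch of that first workaround (a hedged reconstruction, not code from this thread; it assumes the same _images field, injected HttpClient, and endpoint as the repro code above):

private async Task UploadFiles()
{
    const long maxFileSize = 1024 * 1024 * 300;
    using var content = new MultipartFormDataContent();

    foreach (var file in _images!)
    {
        // Drain each browser stream completely before opening the next one,
        // so only one JS stream is ever being read at a time.
        using var browserStream = file.OpenReadStream(maxFileSize);
        using var buffer = new MemoryStream();
        await browserStream.CopyToAsync(buffer);

        var fileContent = new ByteArrayContent(buffer.ToArray());
        fileContent.Headers.ContentType = new MediaTypeHeaderValue(file.ContentType);
        content.Add(fileContent, "files", file.Name);
    }

    // All file bytes are already in managed memory at this point, so HttpClient
    // never pulls from two JS streams concurrently.
    var response = await _httpClient.PostAsync("https://localhost:7112/Files", content);
}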

Another workaround is to change the SignalR MaximumParallelInvocationsPerClient setting to a number equal to or greater than the largest number of concurrent uploads you want to be able to handle. For example, to make two parallel uploads work within the same circuit, set MaximumParallelInvocationsPerClient = 2. By default the value is 1.
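Concretely, mirroring the hub configuration already shown in the bug report, that second workaround amounts to:

builder.Services.AddServerSideBlazor().AddHubOptions(options =>
{
    // Allow two hub invocations in flight per client, so a paused write on one
    // stream cannot starve the writes that feed the other stream.
    options.MaximumParallelInvocationsPerClient = 2;
});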

Reason

The backpressure mechanism is built around a Pipe: the JS side sends calls that write to it, and the .NET side performs stream operations that read from it. That works fine, except when the .NET side chooses not to read from it, at which point the pipe fills up.

  • Our expectation was that this would cause the next "write" call from the JS side to wait asynchronously, but everything else to continue unaffected.
  • We knew that SignalR only allows a single incoming call per circuit by default, but we thought that was OK because it's expected that the pipe reader will be processing the pipe and hence freeing things up.
  • But it goes wrong in a sequence like this:
    • JS supplies content for streams A and B. It tries to supply content for both of them, so that the .NET side could read them in parallel if it wants.
    • However, in the HttpRequestMessage case, the .NET side wants to read all of stream A before it will start reading stream B.
    • At some point, the pipe for stream B fills up, so the next write for stream B will asynchronously pause.
    • Unfortunately, that now means we can no longer receive any writes for stream A (because SignalR won't accept another concurrent call on this hub until the write call for stream B completes).
    • Eventually the reader for stream A will run out of data and asynchronously pause until it gets more writes for stream A, which is now never going to happen.

The problem goes away if SignalR accepts 2 concurrent calls within the circuit, because even though the write call for stream B is blocked, the JS side is still trying to do writes for stream A, so that never runs out of data. And then once the .NET side has finished reading A, it starts reading B, so everything continues and completes.
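To make the pipe behavior concrete, here is a small standalone System.IO.Pipelines sketch (illustrative only; this is not Blazor's internal code) showing the asynchronous pause that occurs once the writer crosses its pause threshold while nothing is reading:

using System.IO.Pipelines;

var pipe = new Pipe(new PipeOptions(
    pauseWriterThreshold: 64 * 1024,    // writer pauses once ~64KB sits unread
    resumeWriterThreshold: 32 * 1024));

var chunk = new byte[16 * 1024];

// Nothing drains pipe.Reader, so after roughly four 16KB writes the threshold
// is crossed and the next flush stays incomplete until a reader catches up;
// this is the same kind of pause that strands stream A in the sequence above.
for (var i = 0; ; i++)
{
    ValueTask<FlushResult> flush = pipe.Writer.WriteAsync(chunk);
    Console.WriteLine($"write {i} completed synchronously: {flush.IsCompleted}");
    if (!flush.IsCompleted)
        break; // left pending here purely for illustration
}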

Solution

We need to change the backpressure mechanism so that, if the .NET side is unable to write to the pipe due to backpressure, then instead of awaiting, it should reject the write instantly and return a message to the JS side saying the write was rejected due to backpressure and it has to try again later. The JS side should then wait for a short period (say, 1 second, or maybe some exponential backoff with a max of 10 seconds or so) before trying again.

This way, the circuit will never have any long-pending async operations due to backpressure.
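The actual fix isn't shown in this thread. Purely as an illustration of the retry-with-backoff pattern being proposed (hypothetical names throughout; the real retry would live on the JS side), the sending side would look roughly like this:

// Hypothetical sketch: retry a rejected chunk with capped exponential backoff
// rather than leaving a long-pending async call inside the hub.
async Task SendChunkWithBackoffAsync(Func<Task<bool>> trySendChunk)
{
    var delay = TimeSpan.FromSeconds(1);
    var maxDelay = TimeSpan.FromSeconds(10);

    // trySendChunk returns false when the receiver rejects the write because
    // its pipe is full (backpressure); true once the chunk is accepted.
    while (!await trySendChunk())
    {
        await Task.Delay(delay);
        delay = TimeSpan.FromTicks(Math.Min(delay.Ticks * 2, maxDelay.Ticks));
    }
}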

@SteveSandersonMS SteveSandersonMS removed their assignment Apr 24, 2023
@SteveSandersonMS
Member

@mkArtakMSFT Removing my assignment now that the investigation is done. This is definitely a bug.

@sabrite
Author

sabrite commented Apr 24, 2023

Thank you @SteveSandersonMS - much appreciated. Since no one responded for almost a month, my workaround was to use a MemoryStream like this, and it always worked irrespective of the number and size of the individual files (and now I know the reason why it works :)

[Screenshot of the MemoryStream-based upload code; not preserved here. It is essentially the byte[] approach sketched under "Workaround" above.]

In my particular case, the user could upload any number of images of size < 300MB, so it's hard to set MaximumParallelInvocationsPerClient in advance. Nonetheless, it's great to know the cause, and I hope it gets fixed soon.

Thanks again.

@mkArtakMSFT mkArtakMSFT added the bug This issue describes a behavior which is not expected - a bug. label Apr 25, 2023
@mkArtakMSFT mkArtakMSFT added this to the .NET 8 Planning milestone Apr 25, 2023
@ghost

ghost commented Apr 25, 2023

Thanks for contacting us.

We're moving this issue to the .NET 8 Planning milestone for future evaluation/consideration. We would like to keep this around to collect more feedback, which can help us with prioritizing this work. We will re-evaluate this issue during our next planning meeting(s).
If we later determine that the issue has no community involvement, or that it is a very rare and low-impact issue, we will close it, so that the team can focus on more important and high-impact issues.
To learn more about what to expect next and how this issue will be handled you can read more about our triage process here.

@mkArtakMSFT mkArtakMSFT added the Priority:1 Work that is critical for the release, but we could probably ship without label Apr 25, 2023
@javiercn javiercn modified the milestones: .NET 8 Planning, 9.0-preview1 Oct 4, 2023
@dotnet-policy-service dotnet-policy-service bot added the pending-ci-rerun When assigned to a PR indicates that the CI checks should be rerun label Feb 6, 2024
@wtgodbe wtgodbe removed the pending-ci-rerun When assigned to a PR indicates that the CI checks should be rerun label Feb 6, 2024
@dotnet-policy-service dotnet-policy-service bot added the pending-ci-rerun When assigned to a PR indicates that the CI checks should be rerun label Feb 6, 2024
@wtgodbe wtgodbe removed the pending-ci-rerun When assigned to a PR indicates that the CI checks should be rerun label Feb 13, 2024
@dotnet dotnet deleted a comment from dotnet-policy-service bot Feb 13, 2024
@dotnet dotnet deleted a comment from dotnet-policy-service bot Feb 13, 2024
@DontEatRice

This error still occurs in .NET 8. The solution provided by @sabrite is what we are trying to avoid, because we expect that a client can upload up to 50MB, and in that scenario the full 50MB gets loaded into a MemoryStream. I see that this issue is in .NET 9 planning; will it make it into the .NET 9 release?

@UsersHaveNames

The error still occurs in .NET 9

@danroth27 danroth27 added Priority:3 Work that is nice to have and removed Priority:1 Work that is critical for the release, but we could probably ship without labels Jan 13, 2025