[Bug]: Files larger than 100MB not uploading / better handling of chunk limits #6473
Comments
You can adjust
AFAIK this only changes it for the Web client.
Businesses often have other preferred ways of mass-deploying internal software such as the Desktop client, including config customizations. In addition, as noted, this isn't an issue for the Web client.

It's possible the Desktop client is more aggressive than our Web client in determining the maximum chunk size. I think the Web uploader most of the time effectively uses the default of 10 MiB (unless the file is particularly large or the admin has changed the default). I haven't fully been through the Desktop client's approach in this area, but going from memory I got the sense that Desktop may start out trying to use as few chunks as possible (and therefore much larger ones), whereas Web does roughly the opposite. I may be wrong, though, since I only glanced at the code a while back, so take this with a grain of salt. https://github.com/search?q=repo%3Anextcloud%2Fdesktop%20maxchunksize&type=code

The Desktop client also has a dynamic mode that adapts chunk size based on target-duration parameters, which further complicates things (and I'm not aware of the Web client having anything comparable).
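For reference, these are the two knobs being discussed, sketched as shell/config snippets. Option names are to the best of my recollection of the server and desktop docs, so verify them against the documentation for your versions before relying on them:

```shell
# Server side: chunk size used by the Web client's chunked uploads, in bytes.
# 95 MiB keeps each chunk request safely under Cloudflare's 100 MB limit.
occ config:app:set files max_chunk_size --value 99614720

# Desktop client side: per-machine setting in the client's nextcloud.cfg,
# under the [General] section (also in bytes):
#
#   [General]
#   maxChunkSize=99614720
```

Note that the `occ` setting only affects the Web client, which is the crux of this issue: the Desktop client reads its limits from the local config file, not from the server.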
P.S. I apologize for leaning a bit more on speculation than I usually prefer in my response a short while ago. My aim was to add to the discussion, but on reflection I probably should have left it as a draft for now. :-) This topic just happens to connect with one that has been on my "to investigate" list for a while: the behavior of chunking, particularly in a default configuration, in the Desktop versus Web clients. (Lacking the time today to dig deeply into this, I'm drawing heavily on memory and an incomplete prior review of the Desktop chunking code.) This is mostly a disclaimer to take my speculation about behavioral differences with a grain of salt for now. :-)
Duplicate of #4278
Bug description
I am using Nextcloud behind Nginx Proxy Manager and Cloudflare. I have given a few family members access to my Nextcloud server, but they cannot upload files larger than 100 MB due to Cloudflare's upload limit. I understand that I can change the max chunk size in the client config file, but this is not a real solution: I can't teach my grandpa how to install TeamViewer so I can fix his setup every time he gets a new laptop, and businesses cannot be expected to change this on every PC in a large-scale deployment. Since Cloudflare is so widely used, either the default chunk size needs to be under 100 MB or I need to be able to adjust it from the server.
Steps to reproduce
Expected behavior
Installation method
None
Nextcloud Server version
28.0.2
Operating system
Docker Linux (Debian/Truenas Scale)
PHP engine version
8.2.16
Web server
Uncertain of what to put here
Database engine version
PostgreSQL 13.1 (Debian 13.1-1.pgdg100+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 8.3.0-6) 8.3.0, 64-bit
Is this bug present after an update or on a fresh install?
Yes
Are you using the Nextcloud Server Encryption module?
Nope
What user-backends are you using?
Configuration report
No response
List of activated Apps
Default apps + cookbook
Nextcloud Signing status
No response
Nextcloud Logs
No response
Additional info
No response