How to read large files? Is there a way to read them in a streaming fashion? #237
Comments
@Zzaniu: I solved this problem with ...
@BharathIO: @MathNya I am also looking for the same. Are there any workarounds at the moment to support streaming? I would appreciate your response. Specifically, I need streaming while writing rows to an Excel workbook sheet.
Ok, thanks for the update @MathNya. When I write around 17k records into a sheet, I observe it consuming a lot of memory, roughly 800 to 900 MB of RAM. Is there anything I can do to use less memory? Please share your thoughts.
Is there any way to write a custom serializer/deserializer for my use case to process 17k records?
17k records in memory are probably going to take a large amount of RAM by themselves.
But in my observation, other libraries such as xlsxwriter did not consume a large amount of RAM. umya-spreadsheet has many more features than other libraries; the only issues are the large RAM consumption and high CPU utilization. Are there any workarounds at the moment?
I believe there are avenues to reduce the memory footprint of many data types, but essentially more features = more data types to keep track of. Therefore it is not always avoidable.
Try my PR to see how much it reduces...
Great, I can see a big change now in terms of memory & CPU utilization. I will validate a few more use cases and post my observations here.
@BharathIO, I would be interested to know the memory usage in the new version.
Big file read, then memory burst