parse_dom_table function performance #1626
Well, my code change above was wrong. I figured out that when exporting a 200 cols * 10000 rows table, the parse_dom_table function no longer runs out of memory, but jszip's utf8ToBytes function does.
@ThomasChan thanks for looking into this, and feel free to submit a PR. There's definitely room for improvement. The weird loop is done that way to address a case like this: the first cell in the second row should be located at F2, but to determine that you need to look at the A1:C2 merge first and then the D1:E2 merge. The implementation was designed expecting only a small number of merges. If you have many, then the approach is extremely slow. Given the parse order, the merge array will always be sorted by starting row then by starting column. To reduce the check to a single walk through the merge array, you might be able to sort by ending row then by starting column (sorting the array with a custom sort function) and keep track of a starting index into the array; elements before that point could never affect the result, so you can skip them (see the sketch below). So that we have a sense for the performance, can you share a sample table that you are trying to convert?
Thanks for the reply, I have submitted a PR. The sample table is nothing special, just any HTML table; you can create a BIG table as I said above (200 cols * 10000 rows), then use xlsx to export it like:

```js
var title = 'Big Table';
var writeOptions = {
  Props: {
    Title: title,
    CreatedDate: new Date().toLocaleString(),
  },
  type: 'binary', // without binary type the export runs out of memory
};
var wb = XLSX.utils.book_new();
var ws = XLSX.utils.table_to_sheet(t, {
  dense: true, // without dense mode the export also runs out of memory
  raw: true, // without raw the export is noticeably slower
});
XLSX.utils.book_append_sheet(wb, ws, title);
XLSX.writeFile(
  wb,
  `${title}.xlsx`,
  writeOptions,
);
```
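For reference, a throwaway table of that size could be generated with something like the following. This is a hypothetical helper, not part of the original report; `t` just matches the variable used in the export snippet above.

```js
// Hypothetical helper: build a large DOM table for testing the export path.
function buildBigTable(rows, cols) {
  var table = document.createElement('table');
  for (var r = 0; r < rows; ++r) {
    var tr = table.insertRow();
    for (var c = 0; c < cols; ++c) {
      tr.insertCell().textContent = 'R' + r + 'C' + c;
    }
  }
  document.body.appendChild(table);
  return table;
}

var t = buildBigTable(10000, 200); // 200 cols * 10000 rows, as described above
```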
https://github.com/SheetJS/js-xlsx/blob/master/xlsx.js#L19061
The `merges` for loop becomes extremely large and slow when handling a large table. I tested this with a 204 cols * 250 rows table: without the optimization, `ws['!merges']` ends up being a huge array, and almost every item is a merge of a single cell, which is useless. Before the optimization, the export function in my test case took 18.6s to execute; after, only 4.67s.
One of my customers was exporting a 193 cols * 1277 rows table; the export took 6 minutes, and after the optimization only 15s.
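A guard along these lines is presumably what the optimization amounts to; this is a sketch rather than the actual commit, and `CS`/`RS` here stand for the cell's colspan/rowspan while `R`/`C` are the current row and column indices.

```js
// Sketch only: record a merge only when the cell actually spans multiple
// cells, instead of pushing a useless 1x1 range for every single cell.
if (CS > 1 || RS > 1) {
  merges.push({ s: { r: R, c: C }, e: { r: R + RS - 1, c: C + CS - 1 } });
}
```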
Code change: henglabs@febac23