Process DB migration 5000 rows at a time instead of all at once. #905
Conversation
This should allow migrations to work for sites that have lower memory limits, a very large amount of data, or both. Returning all the rows of the stream_tmp table at once can take a lot of memory if there's a lot of data there, causing the migration process to fail.

One of our sites was running Stream 1.4.9 and had ~2 million rows in its stream table. Despite having a generous memory limit, the migration process failed when this function attempted to load all 2 million rows into memory at once. Local testing showed that this change allowed the migration to run successfully with a lower memory limit.
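For illustration, a rough sketch of the batched loop this change introduces (simplified, with the per-record migration work omitted; variable names match the diff discussed below):

global $wpdb;

$starting_row   = 0;
$rows_per_round = 5000;

// Pull rows from the temporary table in batches of 5000 instead of all at once.
while ( ! empty( $stream_entries = $wpdb->get_results( "SELECT * FROM {$wpdb->base_prefix}stream_tmp LIMIT $starting_row, $rows_per_round" ) ) ) {
	foreach ( $stream_entries as $entry ) {
		// ... migrate a single record into the new Stream tables (omitted) ...
	}

	// Advance the offset so the next query returns the following batch.
	$starting_row += $rows_per_round;
}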
$context = $wpdb->get_row(
	$wpdb->prepare( "SELECT * FROM {$wpdb->base_prefix}stream_context_tmp WHERE record_id = %s LIMIT 1", $entry->ID )
);
while ( ! empty( $stream_entries = $wpdb->get_results( "SELECT * FROM {$wpdb->base_prefix}stream_tmp LIMIT $starting_row, $rows_per_round" ) ) ) {
@johnolek - for simplicity's sake, could you please split this line into two separate lines?
$stream_entries = $wpdb->get_results( "SELECT * FROM {$wpdb->base_prefix}stream_tmp LIMIT $starting_row, $rows_per_round" );
while ( ! empty( $stream_entries ) ) {
@lukecarbis Sure, I just pushed a change that splits that line into two lines. I also had to add another call right before the end of the loop to attempt to retrieve the next set of results. Let me know if this is what you had in mind.
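Roughly, the loop structure after that change looks like this (a sketch based on the diff below; the loop body is omitted):

// Fetch the first batch of rows before entering the loop.
$stream_entries = $wpdb->get_results( "SELECT * FROM {$wpdb->base_prefix}stream_tmp LIMIT $starting_row, $rows_per_round" );

while ( ! empty( $stream_entries ) ) {
	// ... migrate each entry in this batch (omitted) ...

	$starting_row += $rows_per_round;

	// Retrieve the next set of results right before the end of the loop.
	$stream_entries = $wpdb->get_results( "SELECT * FROM {$wpdb->base_prefix}stream_tmp LIMIT $starting_row, $rows_per_round" );
}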
Just addressing the PHPCS issues picked up here: https://travis-ci.org/xwp/stream/builds/190310427
@@ -63,7 +63,9 @@ function wp_stream_update_auto_300( $db_version, $current_version ) {
$starting_row = 0;
$rows_per_round = 5000;

while ( ! empty( $stream_entries = $wpdb->get_results( "SELECT * FROM {$wpdb->base_prefix}stream_tmp LIMIT $starting_row, $rows_per_round" ) ) ) {
$stream_entries = $wpdb->get_results( "SELECT * FROM {$wpdb->base_prefix}stream_tmp LIMIT $starting_row, $rows_per_round" );
Please use $wpdb->prepare.
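For example, the LIMIT values could be passed to $wpdb->prepare() as integer placeholders instead of being interpolated into the query string (a sketch of the suggestion, not necessarily the exact change that landed):

$stream_entries = $wpdb->get_results(
	$wpdb->prepare(
		"SELECT * FROM {$wpdb->base_prefix}stream_tmp LIMIT %d, %d",
		$starting_row,
		$rows_per_round
	)
);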
@@ -91,6 +93,8 @@ function wp_stream_update_auto_300( $db_version, $current_version ) {
}

$starting_row += $rows_per_round;

$stream_entries = $wpdb->get_results( "SELECT * FROM {$wpdb->base_prefix}stream_tmp LIMIT $starting_row, $rows_per_round" );
Please use $wpdb->prepare.
@lukecarbis I think I finally made all the formatting changes to address the PHPCS issues and it looks like all the checks have passed. Let me know if there's anything else you need updated.