Fix ReadAllFromBigQuery leak temp dataset #31895
Conversation
Tested with: previously, a "bq_read_all..." dataset was left behind after each run. Now the pipeline creates one dataset, and after the test run I checked that this dataset no longer exists.
@@ -302,7 +303,7 @@ def _execute_query(
          self._job_name,
          self._source_uuid,
          bigquery_tools.BigQueryJobTypes.QUERY,
-         '%s_%s' % (int(time.time()), random.randint(0, 1000)))
+         '%s_%s' % (int(time.time()), secrets.token_hex(3)))
This is in alignment with beam/sdks/python/apache_beam/io/gcp/bigquery.py, line 2961 in 0b61035:
self.obj_id = '%d_%s' % (int(time.time()), secrets.token_hex(3))
The old scheme is likely the cause of #26343.
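For context, a minimal standalone comparison of the two suffix schemes; the surrounding job-name construction is simplified and is not the actual Beam code:

```python
import random
import secrets
import time

# Old suffix: at most 1001 distinct values per second, so two queries kicked
# off in the same second can easily collide on the generated job name.
old_suffix = '%s_%s' % (int(time.time()), random.randint(0, 1000))

# New suffix: 3 random bytes -> 6 hex characters (~16.7 million values per
# second), matching how bigquery.py builds obj_id, so collisions are unlikely.
new_suffix = '%s_%s' % (int(time.time()), secrets.token_hex(3))
```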
Assigning reviewers: R: @damccorm for label python. If you would like to opt out of this review, comment with one of the bot's available commands.
The PR bot will only process comments in the main thread (not review comments).
R: @ahmedabu98
Stopping reviewer notifications for this pull request: review requested by someone other than the bot, ceding control. If you'd like to restart, comment with the appropriate bot command.
Thanks for the fix! LGTM
* Fix ReadAllFromBigQuery leak temp dataset
* Fix potential duplicate job name
After this change there is still a dataset leak. I found that on the Dataflow runner, DoFn.teardown is never called even if the job ended successfully (on the Direct Runner, however, teardown is called).
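A tiny standalone pipeline like the sketch below (names are illustrative, not Beam test code) is one way to observe this: on the Direct Runner the teardown log line shows up, while per the observation above it may never appear on Dataflow.

```python
import logging

import apache_beam as beam


class LifecycleLoggingDoFn(beam.DoFn):
  def setup(self):
    logging.info('setup called')

  def process(self, element):
    yield element

  def teardown(self):
    # Expected on the Direct Runner; per the comment above, this may never
    # run on the Dataflow runner even after a successful job.
    logging.info('teardown called')


if __name__ == '__main__':
  with beam.Pipeline() as pipeline:
    _ = (
        pipeline
        | beam.Create([1, 2, 3])
        | beam.ParDo(LifecycleLoggingDoFn()))
```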
Caught by the CleanupGCPResource workflow. There are many stale "bq_read_all_" datasets in the test project.
There is a consistent temp dataset leak in the Python SDK's ReadAllFromBigQuery: each run leaves behind an empty dataset prefixed with "bq_read_all_".
Cause:
beam/sdks/python/apache_beam/io/gcp/bigquery_read_internal.py, line 215 in faff55c
beam/sdks/python/apache_beam/io/gcp/bigquery_tools.py, line 895 in faff55c
Solution:
Remove the prefix so that BigQueryWrapper uses its default temp_dataset_* prefix and deletes the dataset when clean_up_temporary_dataset is called. However, this would then create a dataset for every element. To reduce that overhead, move dataset creation/deletion into setup/teardown so that they happen on a per-worker basis, as sketched below.
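A minimal sketch of the approach described above (not the actual PR code): the DoFn name is made up, and the BigQueryWrapper.create_temporary_dataset / clean_up_temporary_dataset calls are assumed to behave as in apache_beam.io.gcp.bigquery_tools.

```python
import apache_beam as beam
from apache_beam.io.gcp import bigquery_tools


class _BigQueryReadDoFn(beam.DoFn):  # hypothetical name, for illustration only
  def __init__(self, project, location):
    self._project = project
    self._location = location
    self._bq = None

  def setup(self):
    # One temp dataset per worker instead of one per element. Without a
    # custom prefix, BigQueryWrapper falls back to its default temp_dataset_*
    # naming, which clean_up_temporary_dataset knows how to delete.
    self._bq = bigquery_tools.BigQueryWrapper()
    self._bq.create_temporary_dataset(self._project, self._location)

  def process(self, element):
    # ... run the query / export for this element and emit the results ...
    yield element

  def teardown(self):
    # Delete the per-worker temp dataset when the worker shuts down.
    if self._bq is not None:
      self._bq.clean_up_temporary_dataset(self._project)
```

Note that, per the follow-up comment above, teardown is not guaranteed to run on the Dataflow runner, so this alone does not close the leak there.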
Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:
* Mention the appropriate issue in your description (for example: addresses #123), if applicable. This will automatically add a link to the pull request in the issue. If you would like the issue to automatically close on merging the pull request, comment fixes #<ISSUE NUMBER> instead.
* Update CHANGES.md with noteworthy changes.
See the Contributor Guide for more tips on how to make the review process smoother.
To check the build health, please visit https://github.com/apache/beam/blob/master/.test-infra/BUILD_STATUS.md
GitHub Actions Tests Status (on master branch)
See CI.md for more information about GitHub Actions CI, or the workflows README for a list of phrases that trigger workflows.