#2671 Added GitHub workflow file for historic data migrator #2738
Conversation
Terraform plan for dev: No changes. Your infrastructure matches the configuration.
✅ Plan applied in Deploy to Development and Management Environment #344
Terraform plan for meta: No changes. Your infrastructure matches the configuration.
✅ Plan applied in Deploy to Development and Management Environment #344
Minimum allowed coverage is
Generated by 🐒 cobertura-action against a36dbd1
Co-authored-by: Alex Steel <[email protected]>
cf_password: ${{ secrets.CF_PASSWORD }}
cf_org: gsa-tts-oros-fac
cf_space: ${{ env.space }}
command: cf run-task gsa-fac -k 2G -m 2G --name historic_data --command "python manage.py historic-data-migrator --dbkeys ${{ inputs.dbkeys }} --years ${{ inputs.years }}"
What happens when this runs as python manage.py historic-data-migrator --dbkeys --years, since the inputs are not required?
There are default values for these params in the code. A similar workflow is end-to-end-test-data-generator.yml
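The reply above points at defaults baked into the management command itself. Django management commands register their options through argparse, so a minimal sketch of the idea looks like the following. Note this is an illustration, not the actual migrator code: the option names match the thread, but the default values here are made-up placeholders.

```python
import argparse

# Hypothetical defaults -- placeholders for illustration only,
# not the real values inside the FAC historic-data-migrator.
DEFAULT_DBKEYS = "100000"
DEFAULT_YEARS = "22"

def build_parser() -> argparse.ArgumentParser:
    # Django's BaseCommand.add_arguments() populates an argparse parser,
    # so an optional flag with a default behaves exactly like this.
    parser = argparse.ArgumentParser(prog="historic-data-migrator")
    parser.add_argument("--dbkeys", default=DEFAULT_DBKEYS)
    parser.add_argument("--years", default=DEFAULT_YEARS)
    return parser

# Simulate invoking the command with no flags supplied:
args = build_parser().parse_args([])
print(f"dbkeys={args.dbkeys} years={args.years}")  # prints "dbkeys=100000 years=22"
```

One caveat worth testing: omitting a flag entirely triggers the default, but passing the flag with nothing after it (as the templated workflow command could if an input expands to an empty string) is a different case and may error rather than fall back to the default.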
Co-authored-by: Alex Steel <[email protected]>
LGTM
As a reminder, when running this via workflow:
- cf login -a api.fr.cloud.gov --sso
- Select our org and space
- Tail the logs with cf logs gsa-fac | grep "APP/TASK/historic_data", or cf logs gsa-fac | grep "APP/TASK/historic_data_migrator" if you change the name to be explicit
- Run workflow
- Periodically check the task with cf tasks gsa-fac
Co-authored-by: Alex Steel <[email protected]>
* #2671 Added GitHub workflow file for historic data migrator
* #2671 Updated ReadMe and workflow
* Update .github/workflows/historic-data-migrator.yml
Co-authored-by: Alex Steel <[email protected]>
* Update .github/workflows/historic-data-migrator.yml
Co-authored-by: Alex Steel <[email protected]>
* Update .github/workflows/historic-data-migrator.yml
Co-authored-by: Alex Steel <[email protected]>
---------
Co-authored-by: Alex Steel <[email protected]>
Description
#2671
Added GitHub Workflow Action for migrating historical data in dev, staging and preview.
Updated README.
PR checklist: submitters
- Merge main into your branch shortly before creating the PR. (You should also be merging main into your branch regularly during development.)
- Run git status | grep migrations. If there are any results, you probably need to add them to the branch for the PR. Your PR should have only one new migration file for each of the component apps, except in rare circumstances; you may need to delete some and re-run python manage.py makemigrations to reduce the number to one. (Also, unless in exceptional circumstances, your PR should not delete any migration files.)

PR checklist: reviewers
- Run make docker-clean; make docker-first-run && docker compose up; then run docker compose exec web /bin/bash -c "python manage.py test"
The larger the PR, the stricter we should be about these points.