Allow running informative queries as part of ETL process #989
danwilliams added a series of commits referencing this issue between Jul 3 and Jul 12, 2019, including:

- Jul 3, 2019: "Added id because there needs to be a primary key for SQLAlchemy to map the table."
- Jul 3, 2019: "This will be responsible for running the queries and recording the results."
FlowETL should allow running supplementary queries on top of the daily ingestion process (see #988). In the future we will want to make this more sophisticated, but in the short term it would be useful to run a series of simple queries whose results are recorded in the table suggested in #988. This should probably happen in a "post-load" step in the ETL DAG.
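A minimal sketch of what such a post-load step could look like. The names here (`run_post_load_queries`, the `post_etl_queries` results table, and the example queries) are illustrative assumptions, not FlowETL's actual API, and `sqlite3` stands in for the real ingestion database; the table proposed in #988 may have a different schema.

```python
import sqlite3
from datetime import date

# Hypothetical supplementary queries; the real ETL would define these
# per deployment (names and SQL are assumptions for this sketch).
QUERIES = {
    "total_rows": "SELECT COUNT(*) FROM calls",
    "distinct_callers": "SELECT COUNT(DISTINCT caller_id) FROM calls",
}


def run_post_load_queries(conn, queries, cdr_date):
    """Run each supplementary query and record its result.

    Results are written to a `post_etl_queries` table (a stand-in for
    the table suggested in #988), one row per query per CDR date.
    """
    conn.execute(
        """CREATE TABLE IF NOT EXISTS post_etl_queries (
               cdr_date TEXT, query_name TEXT, result TEXT)"""
    )
    for name, sql in queries.items():
        # Each informative query is assumed to return a single scalar.
        (value,) = conn.execute(sql).fetchone()
        conn.execute(
            "INSERT INTO post_etl_queries VALUES (?, ?, ?)",
            (str(cdr_date), name, str(value)),
        )
    conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE calls (caller_id INTEGER)")
    conn.executemany("INSERT INTO calls VALUES (?)", [(1,), (1,), (2,)])
    run_post_load_queries(conn, QUERIES, date(2019, 7, 1))
    for row in conn.execute("SELECT * FROM post_etl_queries"):
        print(row)
```

In an Airflow-based DAG this function would be wrapped in a task placed downstream of the load step, so failures in the supplementary queries do not block the ingestion itself.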