[Epic] Next Steps for DBT / normalization #2566
Comments
All discussions are written up in this doc in more detail, so please feel free to comment! Anyone should be able to access it:
Upvote for incremental batch normalization; this is what I was looking for in #2683. The rationale, as a user, is to offload querying from the OLTP DB to the data warehouse, or to avoid using the OLTP DB as a warehouse in the first place. With incremental batch normalization I am also looking for low latency between source and destination, i.e. the time it takes for new data to be normalized in the warehouse. My use case: I have a (relatively) large amount of historical data, and only a few rows of new data added per day to an OLTP DB (a rough dbt sketch of what this could look like is below).
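For context, something along these lines can already be expressed as a plain dbt incremental model. The sketch below is only illustrative: it assumes a raw Airbyte table named `_airbyte_raw_orders` with the usual `_airbyte_ab_id`, `_airbyte_data`, and `_airbyte_emitted_at` columns, and uses BigQuery's `json_extract_scalar` for the JSON unpacking (other warehouses use different functions).

```sql
-- Illustrative dbt incremental model (assumed table/column names, BigQuery syntax).
-- Only rows emitted since the previous run are normalized, so the large
-- historical backfill is not re-processed for a handful of new daily rows.
{{ config(
    materialized = 'incremental',
    unique_key = '_airbyte_ab_id'
) }}

select
    _airbyte_ab_id,
    _airbyte_emitted_at,
    json_extract_scalar(_airbyte_data, '$.id')         as id,
    json_extract_scalar(_airbyte_data, '$.updated_at') as updated_at
from {{ source('airbyte', '_airbyte_raw_orders') }}

{% if is_incremental() %}
  -- only normalize records newer than what is already in the target table
  where _airbyte_emitted_at > (select max(_airbyte_emitted_at) from {{ this }})
{% endif %}
```

On the first run (or with `--full-refresh`) the `is_incremental()` branch is skipped and the full history is built; on subsequent runs only newly emitted rows are merged in, keyed on `_airbyte_ab_id`.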
Upvote for better
It seems like the related issue to track this is here: #3520
Upvote for better handling of source schema changes. Currently, when I want to add a new table to be synced, Airbyte clears all of the existing destination tables, which is not desired behavior.
The dedicated ticket for incremental normalization support is #4286 |
Tell us about the problem you're trying to solve
Here is a compiled list of normalization features, drawn from discussions and feedback with users:
Describe the solution you’d like
A Google document to discuss and comment on these ideas in more detail, which would serve to prioritize the roadmap around transformations and re-order the items.