[ADAP-692] [Bug] `snowflake-connector-python` dependency version too strict #687
Comments
@ivanstillfront thanks for opening this and doing your homework! I understand the frustration of overly strict dependencies. My hesitation is that we often run into user-facing issues that require fixes in the upstream connector library. The PR in which the Snowflake connector version was bumped, #476, makes reference to #393 (correspondingly snowflakedb/snowflake-connector-python#1274).

As we speak, we're finally on a path to resolving a long-standing multi-threading segfault issue that dbt-snowflake users have been experiencing in both Core and Cloud, which will also likely be resolved by snowflakedb/snowflake-connector-python#1627. Once this happens, we plan to bump the minimum required version of the connector again.

I'm not trying to say that we don't want to fix your problem; rather, that's the perspective from which we've been thinking lately. Maybe there's a middle ground here? Could we open a PR with Airflow to bump the upper limit on their version constraint? I'd love to hear more about your perspective on next steps. Cheers!
Thank you @dataders. @nenkie76 has opened an issue with AWS MWAA, but I doubt they will modify the constraints because they are provided by Airflow. Since AWS maintains the constraints for their managed Airflow service, our hands are tied in this matter. Our only real option is to fork `dbt-snowflake` and relax the pin ourselves.
@ivanstillfront appreciate the teamwork! I accidentally went down a rabbit hole and have some information to share.

Your file, constraints-2.5.1/constraints-3.10.txt, does indeed pin an earlier version of the connector library, but the file I saw mentioned on the main branch of the Airflow repo, constraints-main/constraints-3.8.txt, actually has the latest version of the connector.

aws/aws-mwaa-local-runner#243 was instructive: this AWS-managed Airflow repo depends on another PyPI package, so I don't believe that's where the problem lies. Instead, I'm very suspicious of how aws/aws-mwaa-local-runner's Docker image is configured. There is a hard-coded, committed constraints file, and the first two lines of that file are telling:

```
# This constraints file was automatically generated on 2023-01-18T18:46:04Z
# via "eager-upgrade" mechanism of PIP. For the "v2-5-test" branch of Airflow.
```

I'm still very uncertain as to how Airflow uses static GitHub links to auto-generate these constraint files, but I'm certain enough to be suspicious of the hard-coding. See Airflow's own guidance on constraints files: apache/airflow/blob/main/constraints/README.md.

I think the real "issue" is that this constraints file is being used "in production" when it is really only meant for temporary local development and CI build environments. The smoking gun for me is that all of those dependencies are hard-pinned (`==`), which by definition is not flexible.

The workaround, imo, would be to submit a pull request to aws/aws-mwaa-local-runner that updates the existing constraints file. The long-term solution is to improve the version-specification strategy implemented by aws/aws-mwaa-local-runner.
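To make the pinning point concrete, here are the two styles side by side (illustrative lines; the `==2.9.0` pin and the `>=2.8.2` range are the versions discussed in this thread):

```
# Generated constraints file: every dependency hard-pinned, zero flexibility
snowflake-connector-python==2.9.0

# What a library like dbt-snowflake can declare instead: a lower bound
# that the resolver is free to satisfy with any newer release
snowflake-connector-python[secure-local-storage]>=2.8.2
```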
@ivanstillfront closing this issue for now. Please re-open if you think there's a way we can be of help here.
@dataders thank you for that elaborate investigation; sorry I could not reply sooner. You are correct: AWS is using the constraints "in production", which is a huge PITA to work with. Here is their documentation on this matter: https://docs.aws.amazon.com/mwaa/latest/userguide/working-dags-dependencies.html#working-dags-dependencies-syntax-create

Long term, we will most likely stop using AWS-managed Airflow and roll our own deployment. For now, we have forked `dbt-snowflake` and relaxed the connector pin there.
Is this a new bug in dbt-snowflake?
Current Behavior
When using dbt with Airflow 2.5.1, Python package installation fails because the Airflow constraints pin `snowflake-connector-python` to version 2.9.0, which is incompatible with `dbt-snowflake`, which requires 3.0.0.

Expected Behavior
Package installation should not fail.
Steps To Reproduce
Create a `requirements.txt` file with the content sketched below; `pip install -r requirements.txt` will fail.
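A minimal `requirements.txt` that reproduces the conflict might look like this (a sketch; the constraints URL follows Airflow's standard raw-GitHub pattern for the constraints-2.5.1/constraints-3.10.txt file named earlier in this thread):

```
# requirements.txt — illustrative sketch
# Airflow 2.5.1 constraints pin snowflake-connector-python to 2.9.0
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.5.1/constraints-3.10.txt"
# dbt-snowflake requires snowflake-connector-python 3.0.0, conflicting with the pin above
dbt-snowflake
```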
Relevant log output
Environment
Additional Context
The `snowflake-connector-python` requirement was bumped to `3.0.0` because of a Snyk vulnerability (see 967a8e9), but it would have been sufficient to bump it to 2.8.2 or higher. The `dbt-snowflake` library could be a lot more flexible if the requirement were defined like so:

`"snowflake-connector-python[secure-local-storage]>=2.8.2"`
Happy to push a PR if the maintainers agree with this change.