Tracking closed requirements leads to incomplete checks in workflows #30
Comments
I believe this is the behavior we agreed to have by design. I remember we discussed the "automatic" upgrade of packages one day in the office, and you wanted (for stability) to have this behavior by default, thus avoiding a full rebuild of the requirements file. I kinda agree with what you said that day. Nevertheless, to fully update the requirements and not use the locked versions (from the output file), one could either delete the .txt file or use the pip-tools --upgrade flag. I would KEEP this as-is for safety, and once in a while do a full rebuild of requirements.txt.
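For reference, a minimal sketch of that full rebuild, reusing the first pip-compile invocation quoted later in this issue (file names are whatever the repo actually uses):

```sh
# Option 1: --upgrade makes pip-compile ignore the pins already present in the
# output file and re-resolve everything from the *.in constraints.
pip-compile --upgrade --output-file "requirements/requirements.txt" --quiet --no-emit-index-url subset.in

# Option 2: delete the output file first, so there is no lock to reuse.
rm requirements/requirements.txt
pip-compile --output-file "requirements/requirements.txt" --quiet --no-emit-index-url subset.in
```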
I also believe this is correct. User installation should refer to the open requirements, also because the py4ai open dependencies should then be resolved together with all the other packages a user might have. Unfortunately, there is no way we could perfectly recreate ANY user environment.
I agree that we cannot check all possible requirements configurations but, in the workflows, we should probably check the configuration a user would get in an empty environment (i.e. the one with the most up-to-date requirements).
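As an illustration of such a check (only a sketch, not an actual workflow from this repo; the package name py4ai is taken from the comment above, and the tests/ path is an assumption):

```sh
# Install the package in a throwaway environment, the way a new user would:
# the open requirements get resolved to the latest versions available on PyPI today.
python -m venv /tmp/fresh-env
/tmp/fresh-env/bin/pip install --upgrade pip
/tmp/fresh-env/bin/pip install py4ai

# Run the checks against whatever got resolved.
/tmp/fresh-env/bin/pip install pytest
/tmp/fresh-env/bin/python -m pytest tests/
```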
Uhm, I'm not really sure about this. I mean, I believe there is value in having reproducible and stable pipelines. Look at this use case (quite common actually): we keep the requirements open => we merge a PR when everything works => someone updates a package in the world => our stable version (which used to work) doesn't work anymore. True, we would figure this out promptly, but it becomes very hard to keep even the CI/CD pipeline stable on main branches. Between the two, I would go with the latter. I believe that we should have a stable env, with package upgrades ALWAYS being done via commits/PRs. Rather, we are currently experimenting with tools that automatically detect upgrades and work to keep dependencies fresh: when there is a new upgrade, they detect it, change the code and open a PR. I'd rather go for this than remove the strict dependencies (which are needed to keep the env stable).
On this, I believe it is more important to have a stable/reproducible env. If things break in empty envs (because of an external package upgrade we can't control), the users can always figure out the constraints to impose on the dependencies to make it work, using the requirements.txt.
But we would not lose reproducible envs. We would still have them on our machines, only they would not be versioned.
I don't know... I really think that the current situation is problematic, since we have no (automatic) means to check whether what a user gets from PyPI actually works out of the box, and I think that not tracking the closed requirements would be the quickest fix to this situation.
I know the user can, but:
I think it would be much easier and clearer (for the user at least) if we were the ones handling the compatibility issues as soon as we (automatically) detect them.
Oh, by the way, I'd love it if we were able to actively track updates in our dependencies and automatically update our stable, closed requirements... But I actually have no clue how to do it.
Please, have a look at the tool I posted above.
Sounds a bit like "works on my machine" :D To be honest, I really don't like that things may break on stable branches because of external updates. Once again, if we run the CI/CD pipeline for a given commit on different days, could it break? If yes, I really believe that is bad. To prevent this, every commit should provide the requirements_*.txt files and build from them the minimal envs in which tests and checks pass. Of course, when someone integrates the package in their project the env will be different, but as we said, we cannot reproduce that condition. We should then have a CI/CD job that updates the lock file regularly (once a day or once a week); this should be easy, either by deleting the requirements file and creating a new one, or by using pip-compile with --upgrade. It would create a PR (or even commit to the main branch automatically) if things work, or raise an alert if things break. This last bit is what renovate does. This is the workflow that makes the most sense to me.
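Something along these lines, as a rough sketch only: it reuses the pip-compile invocations quoted in this issue with --upgrade added, and assumes the GitHub CLI (gh) is available; the branch name and PR text are made up.

```sh
#!/usr/bin/env bash
# Periodic job (daily or weekly): re-resolve the lock files and open a PR if
# anything changed. The PR's own CI run then tells us whether things still work.
set -euo pipefail

pip-compile --upgrade --output-file "requirements/requirements.txt" --quiet --no-emit-index-url subset.in
pip-compile --upgrade --output-file requirements/requirements_dev.txt --quiet --no-emit-index-url requirements/requirements.in requirements/requirements_dev.in

if git diff --quiet -- requirements/; then
    echo "Lock files already up to date, nothing to do."
    exit 0
fi

git checkout -b chore/refresh-requirements
git commit -am "Refresh locked requirements"
git push --set-upstream origin chore/refresh-requirements
gh pr create --title "Refresh locked requirements" --body "Automated re-resolution of the *.in constraints."
```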
I like this solution with the automatic upgrade.
I just realised that pip-compile uses the output file as a lock file. This implies that the commands

pip-compile --output-file "requirements/requirements.txt" --quiet --no-emit-index-url subset.in

and

pip-compile --output-file requirements/requirements_dev.txt --quiet --no-emit-index-url requirements/requirements.in requirements/requirements_dev.in

do not update the closed requirements when the locked versions already respect the constraints given in the *.in input files. This implies that the checks run in the CI and CD workflows do not run in exactly the same environment that a new user would get when installing the package from PyPI.
To avoid this, I'd suggest ignoring the closed requirements files (i.e. not versioning them).
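For completeness, a quick way to observe the behaviour described above (a sketch reusing the first command):

```sh
# Re-running the command while newer releases that satisfy *.in exist upstream
# produces no change, because the existing output file is reused as a lock.
pip-compile --output-file "requirements/requirements.txt" --quiet --no-emit-index-url subset.in
git diff --quiet -- requirements/requirements.txt && echo "no change: existing pins were kept"

# The pins only move once the lock is bypassed.
pip-compile --upgrade --output-file "requirements/requirements.txt" --quiet --no-emit-index-url subset.in
git diff --stat -- requirements/requirements.txt
```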