pixi install installs all environments in multi env setting when pypi-dependency is used #1046
I assume this always happens, regardless of the type of requirement. This is expected behavior, because all environments have pypi-dependencies (inherited from the default feature). To be able to solve these pypi environments we need to install a Python interpreter, which is done by first installing only the conda packages into their target prefixes.
Hmm, I guess this makes sense... But I still don't like this behavior, as this will fill up your drive quite fast if you have a large number of environments that you want to test against 😅 I use

```toml
[pypi-dependencies]
polarify = { path = ".", editable = true }
```

to fix #524, but it's annoying that this requires all environments to be actually installed.
Have you envisioned another approach to get rid of this behavior?
Ah, I see your use case. The problem is that to determine the dependencies of your path-based project we need Python to execute the build backend. Since you have environments with different Python versions, the only way to reliably lock this is to invoke Python. The only thing I can think of is that we don't need all conda dependencies to run Python. However, some of them might be used by the build backend, so it's also not really clear which dependencies we can skip during this step.
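To make the "we need Python to execute the build backend" point concrete, here is a rough sketch of what a resolver has to do for a path-based dependency via the standard PEP 517 metadata hook. This is an illustration only, not pixi's actual implementation; `project_metadata_dir` and the backend wiring are hypothetical names:

```python
import importlib
import os
import tempfile

def project_metadata_dir(backend_name: str, source_dir: str) -> str:
    """Sketch: import the project's declared PEP 517 build backend and
    ask it for metadata, returning the generated .dist-info directory.
    This is why an interpreter must exist before pypi deps can be locked.
    """
    backend = importlib.import_module(backend_name)
    out_dir = tempfile.mkdtemp()
    prev_cwd = os.getcwd()
    os.chdir(source_dir)  # PEP 517 hooks run with the source tree as cwd
    try:
        # Optional hook; real tools fall back to building a whole wheel
        # when the backend does not provide it.
        dist_info = backend.prepare_metadata_for_build_wheel(out_dir)
    finally:
        os.chdir(prev_cwd)
    return os.path.join(out_dir, dist_info)
```

Because the backend is arbitrary Python code (hatchling, setuptools, ...), there is no way to get this metadata without running an interpreter that can import the backend.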
It's probably not possible to let the pypi dependencies be evaluated by pixi, because the build backend could theoretically do arbitrary stuff? For example:

```toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
# ...
dependencies = [
    "polars >=0.14.24,<0.21",
]
```

Not sure what other shenanigans hatchling supports.
Not sure if this is a good idea or not: how about creating a hash of pyproject.toml, setup.py and setup.cfg, and if nothing changes, don't try to re-resolve the lockfile and thus don't download all envs?
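The proposed caching check could be sketched as follows. This is a hypothetical illustration of the idea, not anything pixi ships; `build_config_hash` is an invented helper name:

```python
import hashlib
from pathlib import Path

# Files whose contents can change the resolved pypi metadata.
BUILD_FILES = ["pyproject.toml", "setup.py", "setup.cfg"]

def build_config_hash(project_dir: str) -> str:
    """Combine the contents of all build-config files that exist in
    project_dir into one digest. If the digest matches the one stored
    alongside the lockfile, re-resolution could be skipped."""
    digest = hashlib.sha256()
    for name in BUILD_FILES:
        path = Path(project_dir) / name
        if path.is_file():
            digest.update(name.encode())  # keep per-file boundaries
            digest.update(path.read_bytes())
    return digest.hexdigest()
```

As the next comment notes, the same concern applies to any source-based dependency, not just the local path.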
Well, the same goes for any source-based dependencies (there are still a lot of them). But we are indeed planning to lock the path-based dependency based on the hash of the pyproject.
Actually, how are you handling other platforms?
Pixi uses the current platform's Python to do the solving. If that is not available you get this issue: #1051
Also note that the Python interpreter is used for source dependencies, and for the resolving we only care about the metadata. It is currently kind of assumed that, once created, this metadata is static across platforms. There are examples where this is not the case, at least that's what I gather from reading Python threads. With https://peps.python.org/pep-0643/ we can actually check whether the METADATA is static. But I don't know if it will change the behavior much, as there are a lot of old packages that don't have this. Going onwards from this, though, it will help us avoid having to have the Python executable at some point. The UV guys helped merge this in Warehouse recently.
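The PEP 643 check mentioned above boils down to inspecting the core metadata version and the `Dynamic` fields. A hedged sketch, assuming we already have the METADATA text in hand (`metadata_is_static` is a hypothetical helper, not a pixi API):

```python
from email.parser import Parser

def metadata_is_static(metadata_text: str) -> bool:
    """Per PEP 643, core metadata version 2.2+ must declare any
    non-static field under 'Dynamic'. Treat metadata as static only
    when the version is new enough and no Dynamic fields are listed."""
    msg = Parser().parsestr(metadata_text)
    version = msg.get("Metadata-Version", "0.0")
    major, minor = (int(part) for part in version.split(".")[:2])
    if (major, minor) < (2, 2):
        return False  # pre-PEP 643: no guarantee, assume not static
    return not msg.get_all("Dynamic")
```

For older packages (metadata version below 2.2) there is simply no signal either way, which is why the comment above expects limited impact in the short term.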
I'd like to mention this issue is accentuated if one uses a
For documentation purposes: @anjos The auto injection is a helper. You are allowed to move it to a feature, e.g.

```diff
-[tool.pixi.pypi-dependencies]
+[tool.pixi.feature.dev.pypi-dependencies]
 test_feature_editable = { path = ".", editable = true }

+[tool.pixi.environments]
+dev = ["dev"]
```

Your comment still holds, just want to get this out there for users that find this issue.
Thank you, @ruben-arts - that is what we ended up doing for now.