
pip sometimes builds install_requires dep AFTER project, which forces cython .pxd imports to be either setup_requires (breaks cython) or pep517 (breaks wheels and therefore tox) #6406

Closed
ghost opened this issue Apr 13, 2019 · 46 comments
Labels
auto-locked Outdated issues that have been locked by automation

Comments

@ghost commented Apr 13, 2019

Ok, so I tried to use tox, and it runs the following command:

cmdargs: '/home/jonas/Develop/myproject/.tox/py37/bin/python -m pip install --no-deps -U \'/home/jonas/Develop/myproject/.tox/dist/myproject-0.6.zip\''

myproject has an install_requires dependency on wobblui, from which it attempts to import .pxd declarations at build time (Cython's cimport).

Now what pip does is build the wheels in this order:

Building wheels for collected packages: myproject, wobblui, nettools
  Building wheel for myproject (setup.py): started

Please note this builds myproject BEFORE its install_requires dependency. As a result, unsurprisingly, the build fails.

Now this is somewhat problematic, so I think pip should just not do that. Please always build things such that install_requires dependencies are built first; jumbling the order around brings a host of other troubles for cross-package .pxd cimports with Cython.

@benoit-pierre (Member)

There's no guarantee with respect to installation order when it comes to install requirements; you should not rely on it.

@benoit-pierre (Member)

Even if a dependency is installed before, there's no guarantee it will be available during the build phase (as the build may be isolated).

@ghost (Author) commented Apr 13, 2019

Hm, the isolation remark is interesting, but I'm fairly sure build isolation wasn't disabled, and it has never been an issue (not sure why).

Well my suggestion is to make a guarantee, so I can rely on it. That is why I made this ticket. Or is there any advantage in the current behavior of not keeping the order?

As I wrote above, the time consumption when using setup_requires or build-system.requires can be massive: five minutes or longer with a larger Cython dependency, which can add ten minutes or more to build times in cross-compilation tools like https://github.com/kivy/python-for-android if they need to analyze the deps. (And AFAIK there is inherently no way to speed this up, since hooks.get_requires_for_build_wheel by design needs to install both setup_requires and build-system.requires fully before it can analyze the deps, even though in 99% of real-world cases this won't be necessary.) So this is a significant inherent problem, because with modern setuptools and hooks.get_requires_for_build_wheel you would not actually need to build the wheel, and then wouldn't need these deps either, but the setup_requires/build-system.requires format just isn't expressive enough right now to understand this.
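
For illustration, here is roughly what that analysis step looks like. This is a minimal sketch (not python-for-android's actual code), assuming the third-party pep517 helper library and a project using the setuptools.build_meta backend; the backend itself (setuptools, wheel) must already be importable, since no isolated environment is set up here:

# Minimal sketch: ask a source tree for its build requirements via the
# PEP 517 get_requires_for_build_wheel hook, without building a wheel.
# The source path below is hypothetical.
from pep517.wrappers import Pep517HookCaller

source_dir = "/path/to/myproject"
hooks = Pep517HookCaller(source_dir, build_backend="setuptools.build_meta")

# Runs the hook in a subprocess; far cheaper than building the wheel itself.
extra_build_requires = hooks.get_requires_for_build_wheel()
print("Additional build requirements:", extra_build_requires)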

@ghost (Author) commented Apr 13, 2019

Sorry, to be clearer: the two distinct use cases are (1) deps that are needed for build_ext / fully building the wheel / running it (which is the case for .pxd cimport targets), and (2) deps that are only needed to run PEP 517's hooks.get_requires_for_build_wheel on modern setuptools, which does not need to build the wheel in that case (and which is not the case for .pxd cimport targets). Currently there appears to be no way to differentiate between these when specifying package dependencies.

@ghost (Author) commented Apr 13, 2019

So there's a lack of a "this is needed for build_ext but not for any of the other basic stuff you might want to do with the package" option. install_requires could fill that gap (and in practice often does in my tests) if pip just obeyed build order (which it often, but sadly not always, does) and kept site-packages access to the previously built install_requires packages around (which it seems to do).

@benoit-pierre (Member)

What basic stuff? If it's a build-time dependency, it needs to be built before being installed. If you want to speed things up, build wheels, so those build-time dependencies don't have to be built from source.

@benoit-pierre (Member)

It really would be easier to understand what the exact issue is if you pushed an example or the offending project to GitHub and provided the exact command being used.

@ghost (Author) commented Apr 13, 2019

The way you just put it simply doesn't reflect the actual PEP 517 landscape.

There's not just "build-time dependency" and "needed-at-runtime dependency". There is also "needed just to run basic setuptools and examine the package", and currently that's lumped in together with "build-time dependency".

@ghost (Author) commented Apr 13, 2019

Regarding an example: that really depends on what you want to see. If you want to see the slow-down, you need to write code that uses pep517 to examine a package via hooks.get_requires_for_build_wheel. That is currently the only way in Python packaging to get the dependencies of an arbitrary Python package without actually building the wheel (which is very important for basic packaging analysis, e.g. the way we need it for https://github.com/kivy/python-for-android, without wasting lots of time building wheels).

So the problem here really is in the nuances of analyzing packages that are not installed, without actually installing them. Historically I think that was impossible, but with pep517 it now is possible; however, cramming unnecessary stuff into setup_requires/build-system.requires really tanks the analysis time in an extremely undesirable manner.

@benoit-pierre (Member)

First, there's PEP 518 and PEP 517. PEP 518 is the replacement for setup_requires. PEP 517 is for supporting multiple build systems (as in for setuptools replacements). setuptools does provide a PEP 517 compatible backend.

@ghost (Author) commented Apr 13, 2019

Just for clarity, I did actually mean pep517 interacting with build-system.requires / PEP 518. We use pep517 for packaging analysis in python-for-android to map the deps to cross-compile patches when packaging for Android.

Edit: but this really affects any tool that wants to be able to visualize dependencies without actually installing the package. I realize that's not super common, but I'd say it shouldn't be treated as super arcane either.

@benoit-pierre (Member)

Build wheels, then analyze those?

@ghost (Author) commented Apr 13, 2019

pep517 combined with new setuptools has hooks.get_requires_for_build_wheel, which lets you get the metadata without building the wheel. I cannot overstate how important that is, because building the wheels takes really, really long for some projects, so this makes a massive difference in analysis time (a 90% decrease or more). So if for any .pxd imports I need to put their packages into setup_requires or build-system.requires, I tank this analysis time massively, even though they wouldn't even be needed for that step. That is why it would be so much better if it worked by just specifying these things via install_requires, which, again, already almost works in practice, since pip usually obeys install order. The problem really only arises when it doesn't.

@ghost (Author) commented Apr 13, 2019

Sorry if I express all of this poorly; I got into this mostly by necessity (I work on Cython libraries and python-for-android components that deal with this). I'm really not a packaging expert.

@benoit-pierre (Member)

Because things are not declarative, there's no way to inspect source packages without involving the build systems and build dependencies, because you never know when they are going to be needed (like an import in setup.py to provide custom commands)... ¯\_(ツ)_/¯

In theory, a valid source distribution PKG-INFO could provide all the necessary information.
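
For illustration, a minimal sketch of that idea: reading dependency metadata straight out of an sdist's PKG-INFO with only the standard library, no build backend involved. The sdist filename is hypothetical, and (as noted below) setuptools did not emit Requires-Dist entries there at the time, so the result may well be empty:

import tarfile
from email.parser import Parser

def sdist_requires(sdist_path):
    # A .tar.gz sdist ships a pre-generated <name>-<version>/PKG-INFO file.
    with tarfile.open(sdist_path) as tar:
        member = next(m for m in tar.getmembers() if m.name.endswith("/PKG-INFO"))
        metadata = Parser().parsestr(tar.extractfile(member).read().decode("utf-8"))
    # Requires-Dist is where runtime dependencies would be declared.
    return metadata.get_all("Requires-Dist") or []

print(sdist_requires("myproject-0.6.tar.gz"))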

@ghost (Author) commented Apr 13, 2019

Because things are not declarative, there's no way to inspect source packages without involving the build systems and build dependencies, because you never know when they are going to be needed (like an import in setup.py to provide custom commands)... ¯\_(ツ)_/¯

That is a simplified view that really prevents a useful discussion. With pep517 there are different degrees of how far you go into the build, and you do not need to run the full build anymore. That is new and very beneficial, but the current way .pxd deps need to be treated as basic build-system deps (when they are not needed for this kind of analysis) really hurts the potential speedups from that.

@benoit-pierre (Member)

In theory, because setuptools currently does not fill in all the metadata (Requires-Dist entries are missing).

@ghost (Author) commented Apr 13, 2019

If "in theory" refers to getting deps without fully buliding the wheel, no that actually works, you can analyze with pep517 without building the wheel. We're doing that, I'm not making that up 😄

In theory, a valid source distribution PKG-INFO could provide all the necessary information.

As for this: yes, but to put that together you also need to install build-system.requires, since it makes no differentiation between the basic packaging tools needed (like setuptools, poetry, ...) and stuff only needed for build_ext, like .pxd imports, which wouldn't be needed for that. That is really the core of the problem.

@benoit-pierre (Member) commented Apr 13, 2019

How far you go with the build does not mean anything when 99.9% of projects run Python code in setup.py. I mean, the whole point of PEP 518 is to be able to do import build_dep at the top of setup.py, so you don't have to do the complicated dynamic-import custom-command thing anymore.
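
For illustration, a minimal sketch of that pattern, with a hypothetical package name; it assumes a pyproject.toml whose [build-system] requires lists setuptools, wheel and Cython, so the top-level import is safe because pip installs those before running setup.py:

# setup.py of a hypothetical "example-pkg" relying on PEP 518:
# the build dependency is imported at the top of the file, since pip
# installs the declared build requirements before executing setup.py.
from Cython.Build import cythonize
from setuptools import setup

setup(
    name="example-pkg",
    version="0.1",
    packages=["example_pkg"],
    ext_modules=cythonize("example_pkg/*.pyx"),
)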

@benoit-pierre (Member)

No, a source distribution must contain a pre-generated PKG-INFO entry. The problem is that pip supports non-source-distributions too, like VCS checkouts or snapshots; for those there's no way around the need to run setup.py egg_info, or whatever the PEP 517 equivalent is.

@ghost (Author) commented Apr 13, 2019

Again, this is not theoretical. You can have .pxd deps that are only needed for build_ext, and pep517 will be able to analyze the deps with setuptools without building the wheel. Therefore, I am really not sure what you are trying to discuss here, or what "how far [...] does not mean anything" is supposed to mean. Because yes, it really does mean something: it makes a huge practical difference whether you need to go through build_ext or not, and in reality with pep517 you mostly don't need to go through it (in which case having stuff that's only needed for it in setup_requires becomes a huge time waste, which is why it would be so much nicer if e.g. install_requires could serve this purpose, as it already mostly does, but sadly not in all corner cases).

@benoit-pierre (Member)

OK, sure, you have a use case, but that's not in the PEP.

@benoit-pierre (Member)

Or are you having an issue with the particular PEP 517 implementation you're using?

@ghost (Author) commented Apr 13, 2019

I never claimed it was covered by any PEP standard. The only thing I'm saying is: hey, if pip installed in install_requires order, then the world would be a lot nicer for this use case. Is this so difficult to understand? Sorry that I'm getting frustrated over here; I don't get what is so complicated about this.

@benoit-pierre (Member)

No, because there's no guarantee those runtime requirements will be available during any of the build phases (and they won't be if the builds are isolated).

@ghost (Author) commented Apr 13, 2019

This is all so problematic because it seems to be a case nobody thought of (hence it was never specified as an option), but install_requires just happens to mostly fill the gap here. It doesn't, however, when pip jumbles up the order. That is why it is really sad that it does that. Does that make any sense? I'm really not sure how to express it any better.

@ghost (Author) commented Apr 13, 2019

Build isolation is interesting; I never had that issue. Are you sure the deps are not around? Build isolation is enabled by default, isn't it? It seems pip usually puts the install_requires packages into site-packages first in a normal build run, and somehow makes them available during build_ext even despite build isolation. At least, I can't imagine what is going on if that is not the case; that would go against everything I have seen.

@ghost (Author) commented Apr 13, 2019

Ok, here is an actual example I just tried:

setup.py:

from Cython.Build import cythonize
import sys
from setuptools import setup
from setuptools.command.build_ext import build_ext

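# build_ext subclass that imports an install_requires dependency (Pillow)
# at build time, to check whether it is reachable during the build: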
class my_build_ext_hook(build_ext):
    def run(self):
        import PIL
        print("IMPORTED PILLOW!", file=sys.stderr)
        super().run()

setup(
    name="test2",
    version="0.1",
    cmdclass={
        "build_ext": my_build_ext_hook,
    },
    install_requires=["Pillow"],
    packages=["test2"],
    ext_modules = cythonize("test2/__init__.pyx"),
)

test2/__init__.pyx:

cpdef hello():
    print("Hello World!")

Build in isolated mode, where the build_ext step (as you can see above) imports an install_requires dependency:

$ pip3 install --isolated --user -U . -v
Created temporary directory: /tmp/pip-ephem-wheel-cache-_pv3bgev
Created temporary directory: /tmp/pip-req-tracker-rznzs4v7
Created requirements tracker '/tmp/pip-req-tracker-rznzs4v7'
Created temporary directory: /tmp/pip-install-1xbv41vq
Processing /home/jonas/test2
  Created temporary directory: /tmp/pip-req-build-cymfupyv
  Added file:///home/jonas/test2 to build tracker '/tmp/pip-req-tracker-rznzs4v7'
  Running setup.py (path:/tmp/pip-req-build-cymfupyv/setup.py) egg_info for package from file:///home/jonas/test2
    Running command python setup.py egg_info
    Compiling test2/__init__.pyx because it changed.
    [1/1] Cythonizing test2/__init__.pyx
    running egg_info
    creating pip-egg-info/test2.egg-info
    writing pip-egg-info/test2.egg-info/PKG-INFO
    writing dependency_links to pip-egg-info/test2.egg-info/dependency_links.txt
    writing requirements to pip-egg-info/test2.egg-info/requires.txt
    writing top-level names to pip-egg-info/test2.egg-info/top_level.txt
    writing manifest file 'pip-egg-info/test2.egg-info/SOURCES.txt'
    /home/jonas/.local/lib/python3.7/site-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /tmp/pip-req-build-cymfupyv/test2/__init__.pyx
      tree = Parsing.p_module(s, pxd, full_module_name)
    package init file 'test2/__init__.py' not found (or not a regular file)
    reading manifest file 'pip-egg-info/test2.egg-info/SOURCES.txt'
    writing manifest file 'pip-egg-info/test2.egg-info/SOURCES.txt'
  Source in /tmp/pip-req-build-cymfupyv has version 0.1, which satisfies requirement test2==0.1 from file:///home/jonas/test2
  Removed test2==0.1 from file:///home/jonas/test2 from build tracker '/tmp/pip-req-tracker-rznzs4v7'
Requirement already satisfied, skipping upgrade: Pillow in /usr/lib64/python3.7/site-packages (from test2==0.1) (5.3.0)
Could not parse version from link: file:///home/jonas/test2
Building wheels for collected packages: test2
  Created temporary directory: /tmp/pip-wheel-2f82htvd
  Running setup.py bdist_wheel for test2 ...   Destination directory: /tmp/pip-wheel-2f82htvd
  Running command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-req-build-cymfupyv/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/pip-wheel-2f82htvd --python-tag cp37
  running bdist_wheel
  running build
  running build_py
  package init file 'test2/__init__.py' not found (or not a regular file)
  running build_ext
  IMPORTED PILLOW!
  building 'test2.__init__' extension
  creating build
  creating build/temp.linux-x86_64-3.7
  creating build/temp.linux-x86_64-3.7/test2
  gcc -pthread -Wno-unused-result -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -O2 -g -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fexceptions -fstack-protector-strong -grecord-gcc-switches -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -D_GNU_SOURCE -fPIC -fwrapv -fPIC -I/usr/include/python3.7m -c test2/__init__.c -o build/temp.linux-x86_64-3.7/test2/__init__.o
  creating build/lib.linux-x86_64-3.7
  creating build/lib.linux-x86_64-3.7/test2
  gcc -pthread -shared -Wl,-z,relro -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -g build/temp.linux-x86_64-3.7/test2/__init__.o -L/usr/lib64 -lpython3.7m -o build/lib.linux-x86_64-3.7/test2/__init__.cpython-37m-x86_64-linux-gnu.so
  installing to build/bdist.linux-x86_64/wheel
  running install
  running install_lib
  creating build/bdist.linux-x86_64
  creating build/bdist.linux-x86_64/wheel
  creating build/bdist.linux-x86_64/wheel/test2
  copying build/lib.linux-x86_64-3.7/test2/__init__.cpython-37m-x86_64-linux-gnu.so -> build/bdist.linux-x86_64/wheel/test2
  running install_egg_info
  running egg_info
  creating test2.egg-info
  writing test2.egg-info/PKG-INFO
  writing dependency_links to test2.egg-info/dependency_links.txt
  writing requirements to test2.egg-info/requires.txt
  writing top-level names to test2.egg-info/top_level.txt
  writing manifest file 'test2.egg-info/SOURCES.txt'
  reading manifest file 'test2.egg-info/SOURCES.txt'
  writing manifest file 'test2.egg-info/SOURCES.txt'
  Copying test2.egg-info to build/bdist.linux-x86_64/wheel/test2-0.1-py3.7.egg-info
  running install_scripts
  creating build/bdist.linux-x86_64/wheel/test2-0.1.dist-info/WHEEL
  creating '/tmp/pip-wheel-2f82htvd/test2-0.1-cp37-cp37m-linux_x86_64.whl' and adding '.' to it
  adding 'test2/__init__.cpython-37m-x86_64-linux-gnu.so'
  adding 'test2-0.1.dist-info/top_level.txt'
  adding 'test2-0.1.dist-info/WHEEL'
  adding 'test2-0.1.dist-info/METADATA'
  adding 'test2-0.1.dist-info/RECORD'
  removing build/bdist.linux-x86_64/wheel
done
  Stored in directory: /tmp/pip-ephem-wheel-cache-_pv3bgev/wheels/79/c4/01/50d26ce9316f224b38f2f5060ad66b67e0b98bbda9ddde026b
  Removing source in /tmp/pip-req-build-cymfupyv
Successfully built test2
Installing collected packages: test2

Successfully installed test2-0.1
Cleaning up...
Removed build tracker '/tmp/pip-req-tracker-rznzs4v7'

I can now also use the module:

$ python3
Python 3.7.2 (default, Mar 21 2019, 10:09:12) 
[GCC 8.3.1 20190223 (Red Hat 8.3.1-2)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import test2
>>> test2.hello()
Hello World!

So the only issue here seems to be that sometimes the install order isn't in dependency order. If that weren't the case, it would work perfectly. I don't know why that is, but that is how things seem to be, and since the order seemed to me like something that might be fixable, I made this ticket!

I hope this clears up things a little

@benoit-pierre (Member)

You're confusing running pip isolated (--isolated: ignoring environment variables and user configuration) with build isolation, which is automatically enabled when a package uses PEP 518 / 517 (and, for PEP 518 only, can be disabled with --no-build-isolation).

@benoit-pierre (Member)

And your actual example would not work when using a clean virtualenv, no? Since the Cython build time requirement is not declared anywhere.

@ghost (Author) commented Apr 13, 2019

You're right, Cython not being specified is not correct, I agree. But I wasn't trying to make a perfect package, just to show that the isolation apparently allows access to install_requires.

I just added a PEP 518 pyproject.toml file:

[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"

Then I added this MANIFEST.in:

include pyproject.toml
recursive-include test2 *.so *.pyx

And the install still works, including uninstalling & freshly reinstalling. Or do I still need to do something else to enable build isolation?

@benoit-pierre (Member)

Scratch that, it might work depending on the install order... and only if Pillow does the same. Extremely ugly, and broken. You should not need to install Cython when installing from a wheel, for example, since it's a build dependency.

@benoit-pierre (Member)

Can you upload your example zip?

@ghost (Author) commented Apr 13, 2019

Sure! Here you go
test2-example.zip

@ghost (Author) commented Apr 13, 2019

And maybe you're right, and using install_requires for this, with pip changed to always obey install order (which it doesn't right now), is a horrible idea. All I'm trying to say is that from my point of view it doesn't look like the worst thing to try, but again I'm not a packaging expert, so if you're all very certain this is nonsense then I'll take that response 🤷‍♀️

@benoit-pierre (Member)

Are you sure you're using an up-to-date pip?

@benoit-pierre (Member)

Using pip install . (with a clean virtualenv):

Processing [...]/test2
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'error'
[...]
ModuleNotFoundError: No module named 'Cython'

@ghost (Author) commented Apr 13, 2019

Oh, fascinating! I wasn't; I was using 18.1. Now I updated to 19.x and you're right, it doesn't work anymore (even when adding Cython to build-system.requires). So I suppose using install_requires for this is just the wrong idea, and I'll close the ticket. Sorry for all the confusion 😂 it looked like a reasonable approach from my pip version and from what I'd tried, but given the new build isolation I suppose it really doesn't make any sense.

@ghost ghost closed this as completed Apr 13, 2019
@ghost (Author) commented Apr 13, 2019

Thanks so much for all your responses! It's sad this doesn't make too much sense then, but I really learned a lot

@benoit-pierre (Member) commented Apr 13, 2019

I do think guaranteeing the order when handling install_requires is wrong, simply because build-time dependencies do not belong there, so the order should not matter and is an implementation detail. Again, even if it worked, it would mean all installs (including from wheels) would pull in an unnecessary dependency.

And how do you handle the case where two packages need different versions of the same build-time dependency?

@ghost (Author) commented Apr 13, 2019

Yeah, you are right; I thought it helped with this .pxd use case, which inherently won't work under 19.x anyway. It's nonsense 😂 Sorry for using up so much of your time, and thanks again for the responses!

@jdemeyer

There's no guarantee with respect to installation order when it comes to install requirements; you should not rely on it.

Are you sure? This documentation seems to suggest otherwise: https://pip.pypa.io/en/stable/reference/pip_install/#installation-order

@benoit-pierre (Member)

@jdemeyer: yeah, there's some old code in pip that's supposed to do this, but:

  • installation order is not part of a PEP
  • it's clearly buggy

So there's still no guarantee it will work, and it definitely won't help when build isolation is used. Really, some of the arguments about the benefits don't make sense:

  1. Concurrent use of the environment during the install is more likely to work.

Likely, right...

  2. A failed install is less likely to leave a broken environment. Although pip would like to support failure rollbacks eventually, in the mean time, this is an improvement.

Pip has rollbacks.

Old code, old documentation; IMHO, it would be better if at least that part of the documentation was removed.

@benoit-pierre (Member)

Case in point, if wheel is installed, pip will first build wheels for all packages, then install them:

  • without wheel installed:
$ ./venv/bin/python src/pip freeze --all                                                         
pip==19.0.dev0
setuptools==41.0.0
$ ./venv/bin/python src/pip install -t tmp TopoRequires4 --no-index -f tests/data/packages
Looking in links: tests/data/packages
Collecting TopoRequires4
Collecting TopoRequires2 (from TopoRequires4)
Collecting TopoRequires (from TopoRequires4)
Collecting TopoRequires3 (from TopoRequires4)
Installing collected packages: TopoRequires, TopoRequires2, TopoRequires3, TopoRequires4
  Running setup.py install for TopoRequires: started
    Running setup.py install for TopoRequires: finished with status 'done'
  Running setup.py install for TopoRequires2: started
    Running setup.py install for TopoRequires2: finished with status 'done'
  Running setup.py install for TopoRequires3: started
    Running setup.py install for TopoRequires3: finished with status 'done'
  Running setup.py install for TopoRequires4: started
    Running setup.py install for TopoRequires4: finished with status 'done'
Successfully installed TopoRequires-0.0.1 TopoRequires2-0.0.1 TopoRequires3-0.0.1 TopoRequires4-0.0.1
  • with wheel installed:
$ ./venv/bin/python src/pip freeze --all                                                         
pip==19.0.dev0
setuptools==41.0.0
wheel==0.33.1
$ ./venv/bin/python src/pip install -t tmp TopoRequires4 --no-index -f tests/data/packages
Looking in links: tests/data/packages
Collecting TopoRequires4
Collecting TopoRequires2 (from TopoRequires4)
Collecting TopoRequires (from TopoRequires4)
Collecting TopoRequires3 (from TopoRequires4)
Building wheels for collected packages: TopoRequires4, TopoRequires2, TopoRequires, TopoRequires3
  Building wheel for TopoRequires4 (setup.py): started
  Building wheel for TopoRequires4 (setup.py): finished with status 'done'
  Stored in directory: /home/bpierre/.cache/pip/wheels/94/75/42/aaf87ce968d605bb557fc354b6c60bf98590d40f6e83675489
  Building wheel for TopoRequires2 (setup.py): started
  Building wheel for TopoRequires2 (setup.py): finished with status 'done'
  Stored in directory: /home/bpierre/.cache/pip/wheels/a1/6a/95/944f370b4d5f04b46d2ce7352dca59dca33a7aa0bcc59a3295
  Building wheel for TopoRequires (setup.py): started
  Building wheel for TopoRequires (setup.py): finished with status 'done'
  Stored in directory: /home/bpierre/.cache/pip/wheels/cd/e5/f7/72357310b735e0151afaf2b24736c79c276ee08c32e3e98f91
  Building wheel for TopoRequires3 (setup.py): started
  Building wheel for TopoRequires3 (setup.py): finished with status 'done'
  Stored in directory: /home/bpierre/.cache/pip/wheels/c8/43/5c/58057081e51597ec84fa61de1764cb3816854a58b003574768
Successfully built TopoRequires4 TopoRequires2 TopoRequires TopoRequires3
Installing collected packages: TopoRequires, TopoRequires2, TopoRequires3, TopoRequires4
Successfully installed TopoRequires-0.0.1 TopoRequires2-0.0.1 TopoRequires3-0.0.1 TopoRequires4-0.0.1
Target directory /home/bpierre/progs/src/pip/tmp/toporequires4 already exists. Specify --upgrade to force replacement.
Target directory /home/bpierre/progs/src/pip/tmp/toporequires3 already exists. Specify --upgrade to force replacement.
Target directory /home/bpierre/progs/src/pip/tmp/toporequires2 already exists. Specify --upgrade to force replacement.
Target directory /home/bpierre/progs/src/pip/tmp/toporequires already exists. Specify --upgrade to force replacement.

@jdemeyer

Interesting analysis. So this isn't caused by pyproject.toml but by wheel.

lock bot commented May 28, 2019

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

@lock lock bot added the auto-locked Outdated issues that have been locked by automation label May 28, 2019
@lock lock bot locked as resolved and limited conversation to collaborators May 28, 2019