[Verify/Document] Native (conditional) dependency support. #300
@nathanaeljones Have you done any case studies of other package managers that may have these same problems? Do they do conditional dependencies based on arbitrary pivots? Does that condition on the dependency edge affect version selection? I know npm supports native modules being compiled on the target machine at install time. Are you suggesting that we support the full spectrum of things?
We're enhancing MSBuild in VS14 with NuGet-aware targets that understand runtime vs. compile time.
This is what we do today with things like Kestrel in ASP.NET 5. See my comment here: imazen/Imazen.NativeDependencyManager#5 (comment). I'm not sure about making these new things work down-level on all CLRs, but newer ones might be more viable.
Transitive dependencies are going to be understood, with build-time and runtime NuGet resolution coming into the platform. I'm actually not a fan of complicating the graph walk by adding arbitrary pivots to it; I think that's actually an intractable problem. I do, however, agree that we need conventions to describe these things in NuGet packages, with appropriate behaviors at runtime and build time.
Conditional packages based on platform seem to be very widespread. npm has `optionalDependencies` and platform filtering; Bundler has `:platform =>`. The problem with Bundler (and the reason Windows support is troubled) is its lack of support for pivots within the lockfile: Bundler supports platform pivots, but doesn't generate different lockfiles for each target. I think this is considered a flaw by the Bundler team (although I can't find the link; it could have been mentioned at RubyConf 2014). These are the choices I see:
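For reference, npm's `optionalDependencies` marker is just a section in package.json; `fsevents` is the classic macOS-only example. If an optional dependency fails to build or install on the current platform, `npm install` still succeeds (package name and version here are illustrative):

```json
{
  "name": "example-app",
  "version": "1.0.0",
  "optionalDependencies": {
    "fsevents": "^1.0.0"
  }
}
```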
I'm a fan of #4. When it's small, building from source is fine. But MSVC can be 10-50x slower than GCC; building a tiny graphics library (libgd) with 3 dependencies takes over 45 minutes on Windows. On Linux, that's 4 minutes. Both options should be present. I'm also not suggesting arbitrary pivots, rather just 4 pivots. That's a manageable number, I think. If you limit it to modern platforms, there are only 12 permutations. In practice, we're going to see binaries for Windows and source code for everything else. That's 3 permutations, 4-5 if you include ARM processors. I'd expect binaries for *nix platforms only where compilation is excessively slow. Is the graph walk so resource-intensive that we can't take a 3-5x hit for the portions of the graph that have conditional native dependencies? I'm not sure I see how the graph is an intractable problem, especially if we only have to "solve" it for the targets that the consuming project defines. If the NuGet server can send back dependency information for a package without sending the whole package, even the most naive approach should be just fine. You should really talk to @wycats; he's a library package manager design expert, and he's dealt with this problem in many forms.
Walking the graph isn't hard; picking versions based on compatible pivots is hard and leads to unpredictability in the algorithm, which I think is much worse.
I would not select versions based on any of these pivots. I'm not even fully convinced that picking versions based on the .NET Framework version is a good idea.
It's far better to "let it fail" than to make the algorithm unpredictable. If it fails, it can be fixed. If nothing fails, there is no pain point to repair, and that platform may remain stuck in the past forever.
We are going to do something for RTM for sure. Will loop back when there is more meaty code to talk about. |
I hope this can be designed in the open. |
Are there any specs/documents/drafts we can review? |
Not yet, still working on it |
@nathanaeljones we are going to have project.json and a lock file, as well as transitive restore, supporting install.ps1, uninstall.ps1, content, and targets fully in Visual Studio. The code is already public in NuGet, but we are still playing around with it. It requires an MSBuild task/target file that is being worked on as well (but is not public yet). I don't understand the native picture 100%, so I don't want to add misleading information until it's properly documented.
@yishaigalatzer Have you resolved the issues with the MSBuild team? Last I heard they were refusing to support NuGet (or any package manager, for that matter). I think the requirement was to enable reloading of the build file once the package manager has finished its work. Paket and NuGet both ran into the same issue, I think. We'd all like to see a solution that can work without a VS-specific extension, but right now it appears to be at an impasse. The other issue is that AppDomainSetup ShadowCopy functionality needs to be aware of that transitive list; see xUnit's usage here. We've exhaustively explored workarounds to hack this in lieu of a .NET 4.6 feature addition, and the smallest change we can come up with is to add a read-only field to Assembly that provides the original (pre-shadow-copy) path of the assembly. Without that, there's no way to guess where the assembly came from and where the missing files could be located. Ideally, of course, the runtime would be aware of this lockfile and handle the xplat differences for us. More detail about install/uninstall would be great. First-class native binary NuGet packages would be helpful from a security point of view; wrappers wouldn't need to update for a security patch to be applied.
There is no MSBuild involvement; we run nuget restore before build in NuGet 3. Try it out in the RC bits. I'll read about the other issues you mentioned. Paket honestly has a very limited view of what NuGet packages contain, and I can't speak to what its limitations are. I'll make sure that once command-line bits are available we reach out to you with more details.
We are now building NuGet based on lock files and project.json (as you can see in all our client repos). The release is going to be a bit after RTM, but the bits are already somewhat working. We are not currently supporting pivots, but we are finally starting to work on documenting what we have so far.
So I don't think this is going to get fully resolved in the first release. I'm still leaving it at that milestone so we can keep it in front of us, and then turn this issue into a feedback/next-steps kind of issue on top of that release.
@nathanaeljones here is a link to a package that addresses the native scenario -
👍 for having npm-inspired behavior. But npm does not select the native binary based on OS + architecture. Do you think it creates a package dynamically every time you run npm install and decides which binary to pack? No. All packages using node-gyp for a C++ binary dependency have this issue. From the link you provided, optionalDependencies is somewhat misleading in this regard. The os key is a flat filter; it just rejects the entire installation on a non-permitted OS. Then how do they know which binary to keep?
The decision of whether a binary is compatible may be more complicated than just matching the architecture: node.js depends on V8, which has its own versions, and then there are module versions, etc. (Spoken from a series of painful personal experiences with npm/C++/node-gyp.)
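For context, the "flat filter" mentioned above is npm's `os` (and `cpu`) field in package.json: npm refuses to install the package at all on a non-matching platform, rather than choosing among bundled binaries (the package name here is illustrative):

```json
{
  "name": "example-native-binding",
  "version": "1.0.0",
  "os": ["linux", "darwin"],
  "cpu": ["x64"]
}
```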
Pushing a package that will only work on Paket is definitely a worse approach than making something that works everywhere. Yes, it is harder today, and perhaps the answer is to eventually port NuGet 3 down to VS 2013. The answer today is to use targets for down-level Visual Studio support, like the Win2D package does. As for "implodes on any real-world project": I suggest filing issues with as much detail as possible, and working through them together. The general statement above doesn't get us anywhere.
npm mostly delegates the step to node-gyp, which then handles OS dependencies and compiles the code itself using the system's compiler. PyPI likewise doesn't require everything to be prebuilt, and uses setuptools to simplify and abstract the process. I think NuGet could support something similar.
@IMPinball, like @nathanaeljones noted:
Here is a shining example: nodejs/node-gyp#629, with a glimpse of hope (in the form of Build Tools 2015, without VS). In npm's case, it is assumed that all Unix-like operating systems come with C/C++ compilers, while on Windows only MSBuild is supported by node-gyp. The approach you mentioned for PhantomJS (and node-sass) is good: prepare binaries for as many OSes and architectures as you can, and for the rest fall back to node-gyp's default compile-from-source-at-install-time behavior. But at the same time, it is too much work for the maintainer to verify and test binaries on every possible OS and architecture. In my experience, most people are happy if the default install just works. NuGet may experience a totally different trend of packaging if this ongoing relevant discussion in the CoreFX repo gets popular: https://github.com/dotnet/corefx/issues/2302#issuecomment-156132918
@jasonwilliams200OK Sorry to burst your bubble, but I don't think that idea is getting very far on npm, as great as it sounds...
Although I could see the benefits of .NET Core's approach. |
@IMPinball, I think you totally missed the point... but this is more or less what I am trying to say.
I'm not immediately invested in this, but this is the one thing that's stopping me from learning the platform, since I use Linux (and there are a lot of .NET jobs where I live).
I am very glad to see documented Runtime IDs have landed. I do not see this behavior fully documented yet, though. As discussed, it would be unexpected for the NuGet client to select a version based on which runtimes are supported by specific versions; rather, I would expect that the latest matching version number is used (based on semver evaluation), and no 'compatibility' checking is employed when selecting dependencies. If a runtime is missing for a specific version, restore fails, and the user must add a version specification. This ensures that there is no computational complexity in flattening transitive dependencies into a lockfile despite the presence of an additional runtime pivot. Separating native dependencies into packages by runtime enables a very easy deployment scenario: every Travis and AppVeyor task is only responsible for uploading its own runtime's package. The last task can upload an aggregate package which references them, after testing that they are all retrievable (with a retry loop).
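As a sketch of the per-runtime package split described here, the aggregate package can carry a runtime.json that maps each Runtime ID to a runtime-specific package, roughly in the shape NuGet's RID-split packages use (the package IDs, RIDs, and versions below are hypothetical):

```json
{
  "runtimes": {
    "win7-x64": {
      "Example.Native": { "Example.Native.runtime.win7-x64": "1.0.0" }
    },
    "ubuntu.14.04-x64": {
      "Example.Native": { "Example.Native.runtime.ubuntu.14.04-x64": "1.0.0" }
    },
    "osx.10.10-x64": {
      "Example.Native": { "Example.Native.runtime.osx.10.10-x64": "1.0.0" }
    }
  }
}
```

Each CI task then only builds and pushes the `Example.Native.runtime.*` package for its own RID.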
It's been a year without any update here, and I'm not sure what the last status was since this is a very long thread. So @nathanaeljones, can you please check if this is still applicable for `PackageReference`? If it does apply, please open a new issue with the relevant details so that we can take it forward.
Originally posted on Paket, but since this is something NuGet is considering as well, I'm adding it here so we can track and (hopefully) sync efforts. I hear that the ASP.NET team is doing something with native dependencies, but I can't find a spec for it. I suspect that a lockfile and transitive dependency support are prerequisites for this.
A lot has been written about this (challenging) topic, so I'll link to what I've found (and please add more links in the comments).
I've been using a https download-during-boot approach for ImageResizer, but that is slow, unreliable, and annoying. I tried to create an example of how to build the ideal native/managed hybrid project, failed, then started a project to try to hot-fix the problem at runtime, and hit another series of roadblocks.
I've identified a few invalid assumptions that seem responsible for the current state of things.
Some of these are somewhat comical considering how easy it is to parse binaries for the major platforms and determine runtime compatibility.
Removing these assumptions, what new requirements are we left with?
a) We need to gather the referenced files (NuGet references, mind you) and verify that the output folder does not contain any files with conflicting names. If there are conflicting names, the output-folder version MUST be deleted, so that we can AssemblyResolve or LoadLibrary the correct version. We then copy each of the files to an appropriate subfolder of the output folder (or, if AnyCPU, the output folder itself). Since Visual Studio is blind to compatibility (by choice, one must assume), we may end up fighting with the build process a bit. Perhaps by disabling copy-local? Another nice sanity check would be to simply parse the binary headers of everything in the output folder and ensure they are all able to run in a common environment.
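The "parse the binary headers" sanity check is genuinely cheap. Here is a minimal sketch (my own, not existing tooling) that reads the machine field from the PE/COFF header of each Windows binary in an output folder and groups files by target architecture; more than one group means the folder mixes incompatible binaries:

```python
import os
import struct

# A small subset of PE/COFF machine codes.
MACHINE_NAMES = {0x014C: "x86", 0x8664: "x64", 0x01C0: "arm", 0xAA64: "arm64"}

def pe_machine(path):
    """Return the target architecture of a PE binary, or None if not a PE file."""
    with open(path, "rb") as f:
        if f.read(2) != b"MZ":            # DOS header magic
            return None
        f.seek(0x3C)                       # e_lfanew: offset of the PE header
        data = f.read(4)
        if len(data) != 4:
            return None
        (e_lfanew,) = struct.unpack("<I", data)
        f.seek(e_lfanew)
        if f.read(4) != b"PE\0\0":         # PE signature
            return None
        (machine,) = struct.unpack("<H", f.read(2))
        return MACHINE_NAMES.get(machine, "unknown")

def architectures_in(folder):
    """Group PE binaries in an output folder by target architecture."""
    by_arch = {}
    for name in sorted(os.listdir(folder)):
        arch = pe_machine(os.path.join(folder, name))
        if arch is not None:
            by_arch.setdefault(arch, []).append(name)
    return by_arch
```

A real check would also need Mach-O and ELF headers for the other platforms, plus the CorFlags of managed assemblies (AnyCPU vs. 32-bit-preferred), but the principle is the same.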
So, I guess
Conditions:
Given that a fallback mode (building from source) is likely popular, we want to make it easy to ensure that only 1 reference from a conditional set is chosen. We should probably group them within another element or provide an id that prevents duplicates.
Target strings
Target strings need to be as generic as their restrictions permit.
`/` - root is AnyCPU
`/32b/` - managed, assumes 32-bit pointers, otherwise portable
`/64b/` - managed, assumes 64-bit pointers, otherwise portable
`/x86/winish/` - 32-bit, requires Windows APIs.
Pointer size, architecture, and endianness are combined into the first string. Pointer size and endianness are only included if the architecture string doesn't make them redundant. I.e., we would see /ARM-little/ and /ARM-big/, but not /x86-little/.
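The redundancy rule can be made concrete with a tiny sketch. Everything here (the function name, the implied-endianness table, and the exact segment spellings) is my own illustration of the convention described above, not an existing API:

```python
# Architectures whose name already implies an endianness; for those, the
# "-little"/"-big" suffix is redundant and omitted per the rule above.
IMPLIED_ENDIAN = {"x86": "little", "x64": "little"}

def target_string(arch=None, pointer=None, endian=None, platform=None):
    """Build a target string like '/', '/32b/', '/x86/winish/', '/ARM-little/'."""
    parts = []
    if arch:
        segment = arch
        if endian and IMPLIED_ENDIAN.get(arch) != endian:
            segment += "-" + endian        # only when not implied by the arch
        parts.append(segment)
    elif pointer:
        parts.append(pointer)              # managed, pointer-size-specific only
    if platform:
        parts.append(platform)
    if not parts:
        return "/"                         # root: AnyCPU
    return "/" + "/".join(parts) + "/"
```

So `target_string(arch="ARM", endian="little")` yields `/ARM-little/`, while `target_string(arch="x86", endian="little")` collapses to `/x86/` because x86 implies little-endian.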