binary cache 404 in dependency fetch causes loop #245
Comments
This can be reproduced with Nix only, so it's probably a bug in Nix's goal state machine.
Did you report it to Nix? We saw this behavior in our CI as well, on agent version 0.7.3. Also, how do we work around it?
Here's the Nix issue NixOS/nix#3964. So far I've assumed it was a one-off, but that doesn't seem to be the case. To work around the issue, you could build it manually without the "broken" cache.
If you have a single agent per architecture, you could run it there and have the agent pick up the output when you click Rebuild.
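A minimal sketch of that manual build, assuming the agent machine has the project checked out and that `default.nix` is the entry point (both are assumptions, not stated in this issue): passing an empty `substituters` setting keeps Nix from consulting the 404ing cache and forces a local build.

```sh
# Hedged sketch: build locally with binary-cache substitution disabled,
# so the "broken" cache is never consulted. default.nix is a placeholder
# for whatever the agent normally builds.
nix-build --option substituters "" default.nix
# Afterwards, clicking Rebuild should let the agent pick up the now-local outputs.
```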
@jappeace Could you provide the derivation path? You can send it to [email protected] if you prefer.
I sent an email 👌
Update: Domen has improved Cachix to better avoid this bug. I'm closing this in favor of NixOS/nix#3964, but feel free to comment or contact support if this recurs.
Description
The log shows the same two dependencies being fetched over and over; specifically, the subtree of a dependent of the missing path.
P -> A -> B

x -> y: x needs y
P: what the agent is trying to build
A: a path that is in a binary cache
B: a path that is not in any binary cache

It will keep trying to fetch A and B.
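For concreteness, here is a minimal sketch of that dependency shape (all names below are illustrative assumptions, not taken from the issue): B is absent from every cache, A depends on B and is assumed to be served by the binary cache, and P is the agent's target.

```sh
# Illustrative sketch of the P -> A -> B shape; every name here is hypothetical.
cat > repro.nix <<'EOF'
let
  pkgs = import <nixpkgs> {};
  # B: assumed absent from every binary cache
  B = pkgs.runCommand "b-not-in-any-cache" {} "echo b > $out";
  # A: assumed present in the binary cache, depends on B
  A = pkgs.runCommand "a-in-binary-cache" { inherit B; } "cat $B > $out";
in
  # P: what the agent is trying to build, depends on A
  pkgs.runCommand "p-agent-target" { inherit A; } "cat $A > $out"
EOF
nix-build repro.nix
```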
I'm investigating whether this could be related to broken C++ exception handling (https://gitlab.haskell.org/ghc/ghc/-/issues/11829).
Agent 0.6 may be unaffected, but other fixes have not been backported there and it does not have live logs.
To Reproduce
Expected behavior
The exception is caught and the dependency's derivation is built as a fallback.
Logs
Only fv3... exists, in the private cache.
Platform / Version
darwin, 0.7.4