Dirty tree on deployed computers #48
With the exception of …
What is the benefit of storing github downloads in a git repo? The only …
I would also vote for ignoring the puppet cache. Storing the tar files in the repo makes little sense to me. As for the Puppetfile.lock you should run …
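For reference, a minimal sketch of adding an ignore rule and confirming git honors it with `git check-ignore`, run in a throwaway repo. The `.projects` entry comes from later in this thread; the puppet cache path varies by setup, so treat any list of entries as an assumption rather than an official recommendation.

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q .

# Append ignore rules for locally generated files. ".projects" is the
# path named in this thread; add your own cache paths as needed.
cat >> .gitignore <<'EOF'
.projects
EOF

# git check-ignore exits 0 when the path matches an ignore rule.
git check-ignore -q .projects && echo ".projects is ignored"
```

`git check-ignore -v` additionally prints which `.gitignore` line matched, which is handy when several ignore files interact.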
Ignore whatever you want, but look at some actual disk space usage before you do. We've found that checking in dependencies makes for faster, less error-prone deploys. @mmuehlberger Don't bother with …
I'm closing this because it's template opinion, not a bug.
@roderik To answer your query directly, it saves every person running your boxen from making N API calls to GitHub.com to download N tarballs any time N modules change in the Puppetfile. This turns out to be quite a savings in runtime of Boxen. Most of the tarballs are around 4-8K in size, and git actually does a solid job of not blowing up the repository size (GitHub's boxen, which has had numerous module updates, clocks in at less than 1.2MB on our file servers after around 6 months or so).
.projects should be ignored by default based on boxen/our-boxen#48 (comment)
I'm trying out boxen on a clean VM, and every time I add something to my-boxen I rerun boxen --debug on the VM.
I've run into a lot of dirty-tree errors, so it wouldn't update. Since I know for sure I didn't do anything on the VM except run boxen, this poses a problem.
I fixed it by removing a number of files from the git repo and ignoring them.
Is this the way to go, or am I just missing something?
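For anyone landing here later, a minimal sketch of the "remove and ignore" fix described above, run in a throwaway repo: `git rm --cached` untracks a path without deleting it from disk, and an ignore rule then keeps it from dirtying the tree again. The `.projects` path is taken from this thread; substitute whatever your boxen run actually generates.

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q .

# Simulate a generated directory that was committed by mistake.
mkdir .projects && touch .projects/state
git add -A
git -c user.email=you@example.com -c user.name=you commit -qm init

# Untrack the directory but keep it on disk, then ignore it.
git rm -r -q --cached .projects
echo '.projects/' >> .gitignore

git status --short   # shows the staged deletion plus the new .gitignore
```

After committing the staged removal and the `.gitignore` change, subsequent boxen runs that touch `.projects` no longer show up as a dirty tree.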