Parallelise test suite #1041
Conversation
Force-pushed from a5b8d95 to 176c825.
Nice work again @penman! Looks like https://travis-ci.org/Homebrew/brew/builds/161295814 shows a slight improvement over https://travis-ci.org/Homebrew/brew.
This is the only blocker on this, I think. Would be good to get this output a bit nicer if possible.
Does …
This probably breaks coverage reporting, as I think the …
We can wait and see what the results say here when this CI job is finished.
@MikeMcQuaid turns out I didn't need …
@penman Cool. Was just wondering if that made you see any more speedup after running a few times.
I could get Minitest to print nothing but the dots, so no wordy output would get in the way. Would that be enough?
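A dots-only reporter along these lines could be built on Minitest's `AbstractReporter`. This is a hypothetical sketch of the approach being discussed, not the PR's actual code; the class name and constructor are assumptions:

```ruby
require "minitest"
require "stringio"

# Hypothetical sketch: a Minitest reporter that prints only the progress
# characters, suppressing the per-process header and summary output that
# would otherwise interleave across parallel worker processes.
class DotsOnlyReporter < Minitest::AbstractReporter
  def initialize(io = $stdout)
    super()
    @io = io
  end

  # Called once per completed test: emit a single character, nothing else.
  def record(result)
    @io.print(result.skipped? ? "S" : (result.passed? ? "." : "F"))
  end
end
```

A reporter like this would be registered via Minitest's plugin/reporter hook; the trade-off, as noted in the following replies, is that failure details are no longer printed at the end.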
@penman Sounds perfect. Would errors be handled nicely at the end?
@MikeMcQuaid oh, uh, with that approach errors aren't reported at all. Perhaps I could get them reporting without all of Minitest's other reporting, but even with that I think they'd still spit out in the middle of the output when the process running them ended. I don't know that I can think of a good way to do this…
Here's how they do it for RSpec: https://github.com/grosser/parallel_tests/blob/03d87ea93047e930296cf1e3549cb7bf76947619/lib/parallel_tests/rspec/logger_base.rb |
Oh, but there's the …
@penman That sounds like at least what we want for CI (e.g. if …)
Aborted? Hmm…
@penman Timed out after 45m. Let's try it again.
Aborted again…
It's hanging because SimpleCov is trying to filter a hash in (at least) O(n²) time. I'm going to submit a PR to them to make this more efficient. The reason this has suddenly become a problem is that we seem to have started collecting coverage for more than 10,000 Ruby files. I can try to investigate this, but I'm not very familiar with SimpleCov, so it might take a while. If anybody else wants to take a look, you can reproduce with brew cask-tests --coverage
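The quadratic pattern is easy to reproduce in miniature: checking each of N files against an N-element Array is O(n²), while converting the ignore list to a Set makes each membership test O(1). This is an illustrative sketch, not SimpleCov's actual code:

```ruby
require "set"

files   = (1..5_000).map { |i| "lib/file_#{i}.rb" }
ignored = files.each_slice(2).map(&:first) # every other file

# O(n^2): Array#include? scans the whole ignore list for every file checked.
slow = files.reject { |f| ignored.include?(f) }

# O(n): build a Set once, then each membership test is constant time.
ignored_set = ignored.to_set
fast = files.reject { |f| ignored_set.include?(f) }

raise "results differ" unless slow == fast
```

With tens of thousands of tracked files, the difference between these two shapes is the difference between seconds and a hung CI job.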
@penman Yeah, the performance with …
I don't start it until January 😞 |
GREEEEEEEEEN. I'm skeptical that code coverage has jumped 11% though, so I'd still like somebody who knows about that to take a look at it.
Want to play around with this locally to investigate the speed differences. Thoughts on the simplecov issues? Thanks @penman!
Will also play around a bit locally to see if I can get the simplecov issues sorted. |
Is the Sierra CI machine any different to the El Capitan / Yosemite ones? |
@penman It's running …
The test times are similar with and without coverage, with the exception of …

```ruby
# TODO: setting the --seed here is an ugly temporary hack, to remain only
# until test-suite glitches are fixed.
ENV["TESTOPTS"] = "--seed=14830" if ENV["TRAVIS"]
```
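Pinning the seed works because Minitest shuffles test order from that seed, so the same seed reproduces the same order, making order-dependent failures seen on CI reproducible locally. A toy illustration of seed-determined shuffling, using Ruby's `Random` directly rather than Minitest's internals (the test names are made up):

```ruby
tests = %w[test_install test_uninstall test_upgrade test_doctor]

# The same seed always yields the same shuffled order, so a failure that
# only appears under one ordering can be replayed on demand.
run_a = tests.shuffle(random: Random.new(14_830))
run_b = tests.shuffle(random: Random.new(14_830))

raise "orders differ" unless run_a == run_b
```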
You can remove this as the cask tests are not run on Travis anymore anyway.
@penman Thanks for digging in there. This will be why it's so slow. I'm tempted to just 💀 the coverage on those commands until we can get the times more sane.
@MikeMcQuaid What if I told you I could get them to run with coverage more than twice as fast as they do now? 😉 Feels like it should be a separate PR, though…
Thanks for pointing to this Gemfile! Now I know why cask coverage is at 100% at all times: …
Everything seems to be working, except coverage, which I'll fix in a follow-up PR. So 👍 to merging this.
@penman Feel free to make it this PR! Still thinking about whether parallel tests provide a decent, measurable speedup. Your thoughts?
@MikeMcQuaid already did! #1084
@MikeMcQuaid I think it's a good thing to have. I think (but haven't researched, this is a hunch) that we're not seeing a lot of the potential for speedup because parallel_tests will try to spread different tests across different processes, but different tests in the same file are always grouped together. When I look at the output, I get four processes. Three finish at the same time, and one goes on much longer, slowing things down. That final process is the one that runs …
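The effect described here can be sketched with a small greedy partitioner: because parallel_tests never splits a single file across processes, one file with a long runtime caps the overall wall-clock time no matter how the remaining files are balanced. This is a hypothetical sketch of the idea, not the gem's actual algorithm, and the filenames and timings are invented:

```ruby
# Assign each file (longest first) to the currently lightest group; a
# single 40s file still forces one group to take at least 40s overall.
def balance(runtimes, group_count)
  groups = Array.new(group_count) { { total: 0.0, files: [] } }
  runtimes.sort_by { |_, t| -t }.each do |file, t|
    lightest = groups.min_by { |g| g[:total] }
    lightest[:files] << file
    lightest[:total] += t
  end
  groups
end

runtimes = { "cask_test.rb" => 40.0, "a_test.rb" => 10.0,
             "b_test.rb" => 10.0, "c_test.rb" => 10.0 }
groups = balance(runtimes, 2)
```

Here the best possible split is 40s versus 30s: splitting the slow file itself into smaller test files is the only way to bring the wall-clock time down further.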
Even without any further changes, …
Good enough for me! Nice work again @penman (I'm almost bored of saying that now 😆)
Big time. On my ancient Core 2 Duo machine I get this: …

…brew tests with your changes locally?

Integrates parallel_tests with brew tests and brew cask-tests, as suggested in #933. brew tests now runs from Library/Homebrew, not Library/Homebrew/test, because parallel_tests demands it be so. I had to modify one test to make this work.