Example tests to run in parallel #639

Closed
avital opened this issue Nov 16, 2020 · 9 comments
Labels
- Priority: P2 - no schedule (Best effort response and resolution. We have no plan to work on this at the moment.)
- Status: pull requests welcome (We agree with the direction proposed, feel free to give it a shot and file a pull request.)

Comments

avital (Contributor) commented Nov 16, 2020

Right now we run each example's tests sequentially in tests/run_all_tests.sh (this is due to pytest-dev/pytest#3151). Because of that, we can't run different example tests in parallel (though pytest can do this within a single run).

Consider either finding a way to work around the pytest limitation that two tests with the same filename can't be defined in different directories, or, if that's really not possible, at least try to find a way to rewrite tests/run_all_tests.sh so that it runs in parallel but safely (e.g. can we run pytest multiple times concurrently -- does it keep any state directory that we need to isolate?). A rough sketch of the parallel approach is below.
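
A minimal sketch (not from the original script) of how tests/run_all_tests.sh could launch one pytest process per example directory and wait for all of them; the examples/*/ glob and the per-run --basetemp isolation are assumptions:

```bash
#!/bin/bash
# Hypothetical variant of tests/run_all_tests.sh: one pytest process per
# example directory, run concurrently. Each run gets its own --basetemp so
# concurrent runs do not share pytest's temporary directory.
set -e

pids=()
for dir in examples/*/; do
  pytest "$dir" --basetemp="/tmp/pytest_$(basename "$dir")" &
  pids+=($!)
done

# wait collects each child's exit status; with `set -e` the script aborts
# if any of the parallel runs failed.
for pid in "${pids[@]}"; do
  wait "$pid"
done
```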

avital (Contributor, Author) commented Nov 16, 2020

/cc @Marvin182 @andsteing

avital added the "Priority: P2 - no schedule", "Status: pull requests welcome", and "Type: maintenance" labels on Nov 26, 2020
avital (Contributor, Author) commented Nov 26, 2020

Marking as "pull requests welcome"

andsteing (Collaborator) commented:

Current timings from the tests when run on GitHub:

- flax/ in 61.14s (0:01:01)
- examples/pixelcnn in 10.56s
- examples/seq2seq in 20.55s
- examples/mnist/ in 9.11s
- examples/ppo/ in 20.88s
- examples/graph/ in 5.15s
- examples/sst2/ in 22.40s
- examples/lm1b/ in 42.18s
- examples/imagenet/ in 141.12s (0:02:21)
- linen_examples/pixelcnn/ in 30.65s
- linen_examples/wmt/ in 94.05s (0:01:34)
- linen_examples/seq2seq/ in 25.71s
- linen_examples/mnist/ in 10.00s
- linen_examples/nlp_seq/ in 2.28s
- linen_examples/imagenet/ in 330.18s (0:05:30)

avital added this to the Core Team Productivity milestone on Dec 12, 2020
andsteing (Collaborator) commented:

@avital is there a way to see GitHub VM CPU usage during the build workflow? How many cores do we have?

When running locally, the expensive tests like wmt and imagenet, which account for most of the test time, max out a single core, so parallelizing would only make sense if we have multiple cores on GitHub.
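
One way (not from the original thread) to answer this would be a throwaway diagnostic step in the workflow; the commands are standard Linux tools, but adding such a step is an assumption:

```bash
# Hypothetical diagnostic step for the GitHub Actions workflow: print the
# number of CPU cores and the memory available on the runner VM.
nproc                                            # core count
free -h                                          # memory
grep "model name" /proc/cpuinfo | sort -u        # CPU model
```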

avital (Contributor, Author) commented Dec 15, 2020

My guess is that the default is one core, but I'm not sure. Looks like we can use "self-hosted runners" if we want to.

@jheek knows more about GitHub Actions than I do; maybe he knows how we can find the VM specs for the free GitHub Actions build machines.

jheek (Member) commented Dec 16, 2020

The specs of the machine seem to be 2 cores with 7 GB of RAM.
We could start using pytest -n auto to automatically parallelize to the number of CPUs available.
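
For reference, the -n option comes from the pytest-xdist plugin; a sketch of how a single example's test run could use it (adding pytest-xdist as a dependency is an assumption, and the example path is taken from the timings above):

```bash
# pytest-xdist provides the -n option; it is not necessarily already a
# dependency of the project.
pip install pytest-xdist

# Spawn one worker process per available CPU core for a single test run.
pytest examples/mnist -n auto
```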

jheek (Member) commented Dec 16, 2020

I also noticed we are spending most of the time compiling. I'll try disabling the heavy LLVM optimizations in the longer tests.
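
One possible way to do this (an assumption here, not necessarily what #741 ended up doing) is to lower XLA's backend optimization level via the XLA_FLAGS environment variable for the slow test runs:

```bash
# Hypothetical: trade runtime performance for much faster compilation by
# turning down the XLA/LLVM backend optimization level for a slow test run.
XLA_FLAGS=--xla_backend_optimization_level=0 pytest linen_examples/imagenet
```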

jheek (Member) commented Dec 16, 2020

Filed #741 for disabling LLVM optimization in some tests

marcvanzee removed this from the Core Team Productivity milestone on Apr 28, 2022
marcvanzee (Collaborator) commented:

Closing this since I believe that after #2458 we run our tests in parallel. @cgarciae please let me know if you think I am wrong!
