See #168.
I just went ahead and did it; I hope that's OK 😳. I'm just really excited to see you working on this again, and want to help where I can. If you had other ideas, I won't be offended if you want to close this and do it elsewhere.
Aside from adding StableRNGs, a few other changes are in this PR (easy to revert if you'd prefer):

- Removed `REQUIRE` - you're not testing on 0.7, and it seems like it would be hard to maintain that much backward compat.
- Removed `.travis.yml`, since you've got GHA going.
- Added `test/Project.toml`, so that tests can have their own environment and you don't need to have `StableRNGs.jl` as a dependency in the main package itself (rough sketch below).
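In case it's useful, this is roughly how that test-only environment gets created; the exact list of test dependencies here is just illustrative, not the final contents of `test/Project.toml`:

```julia
# Run from the package root. This creates/updates test/Project.toml so that
# StableRNGs (and Test) are test-only dependencies rather than entries in the
# main Project.toml. The package list is illustrative.
using Pkg
Pkg.activate("test")
Pkg.add(["Test", "StableRNGs"])
```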
To double check: there was one failing test in `test/pca.jl` that was due to the random numbers, but also one where I had to adjust the tolerance on `isapprox()` by a factor of 10 to make it pass (here). This doesn't seem like it should be related to the random number generation, but I don't know the method well enough to say.
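To make the tolerance change concrete, here's a toy version of the kind of adjustment involved; the data and numbers are made up, not taken from `test/pca.jl`:

```julia
# Toy illustration only: StableRNG gives identical "data" on every Julia
# version, and the check passes with a loosened absolute tolerance.
using StableRNGs, Test

rng = StableRNG(1)                  # fixed seed -> reproducible values
X  = randn(rng, 20, 3)              # stand-in for the test's input data
Xr = X .+ 1e-7                      # stand-in result with a small numerical error

@test isapprox(X, Xr; atol = 1e-6)  # passes; a 10x tighter atol = 1e-7 would fail here
```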
One other thought: given the way that the tests are currently set up, I wonder if you'd like to use ReTest.jl. I ask because, given the structure of the tests, I'm assuming that during development you're doing `include("test/...")` in the REPL. This can cause issues with global variables sticking around, and it means you have e.g. `using Statistics` over and over. With ReTest.jl, you can explicitly filter which test sets run, only re-run tests that have changed, etc. (rough example below). I recently switched my packages over and it's pretty easy. I'd be happy to do that here or in another PR if you're interested.
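For what that workflow could look like, here's a minimal sketch; the module name, testset names, and filter pattern are placeholders rather than anything from this repo:

```julia
# Minimal ReTest.jl sketch: testsets are declared once inside a module, then
# re-run selectively from the REPL without re-including the test files.
# "PkgTests", "pca", and "whiten" are placeholder names.
using ReTest

module PkgTests
using ReTest, Statistics

@testset "pca" begin
    @test mean([1.0, 2.0, 3.0]) == 2.0
end

@testset "whiten" begin
    @test std([1.0, 1.0, 1.0]) == 0.0
end

end # module

retest(PkgTests, "pca")   # run only the testsets whose description matches "pca"
```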