Add test to see whether all Python files can be imported. #343
Conversation
Codecov Report

```diff
@@            Coverage Diff             @@
##           dev_master     #343      +/- ##
==============================================
- Coverage       77.04%   75.52%   -1.52%
==============================================
  Files              57       59       +2
  Lines            7693     7870     +177
==============================================
+ Hits             5927     5944      +17
- Misses           1766     1926     +160
```

☔ View full report in Codecov by Sentry.
Oh, this change might mess things up, because it imports modules and thereby might make the tests order-dependent. Perhaps everything would work fine if all the other tests are properly mocked, but still. I propose to simply remove that file, because it doesn't really do anything that the IRDB is not also doing. It does profiling, but we can do that in the IRDB too, and perhaps in a generalized way. So I'll do just that: remove the file.
The only useful thing in there is profiling; we should do that in a more generic way. The test itself is covered by the IRDB.
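A more generic profiling helper might look something like the following rough sketch; the name `profile_call` and its interface are made up for illustration, and a real nightly setup would also record the timings somewhere so they can be plotted over time:

```python
import time


def profile_call(func, *args, n_runs=5, **kwargs):
    """Return the best wall-clock time (seconds) over n_runs calls of func.

    Hypothetical helper: taking the minimum over several runs reduces
    noise from other processes on the machine.
    """
    timings = []
    for _ in range(n_runs):
        start = time.perf_counter()
        func(*args, **kwargs)
        timings.append(time.perf_counter() - start)
    return min(timings)
```

A nightly job could then call this for a few representative simulations and append the results to a timeseries.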
The one now-failing test is again a 403, which led me to the discovery that the new "Tests with updated dependencies" uses …
Certainly a useful addition 👍
I noticed that still not all Python files were covered, so I redid the logic once more. This imported the broken profiling tests. Instead of fixing them, I decided to just delete them. We might need some profiling tests, but we should do that in a more generalized way, and then run them nightly on zeus or something and make plots of the timings over time. We can easily recreate the deleted file; it is a trivial setup.

Furthermore, I decided to add a time limit of 200 milliseconds on importing the modules. This showed that …

Could you perhaps review this again @teutoburg? Sorry for the rerequest; I realized too late it was still not covering everything the previous time.
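A minimal sketch of such an import-all-modules check, assuming the package can be walked with `pkgutil`; the helper names below are illustrative, not the actual implementation in this PR:

```python
import importlib
import pkgutil
import time


def iter_module_names(package):
    """Yield the dotted names of a package and all modules below it."""
    yield package.__name__
    prefix = package.__name__ + "."
    for _, name, _ in pkgutil.walk_packages(package.__path__, prefix=prefix):
        yield name


def import_time(module_name):
    """Import a module and return the elapsed wall-clock time in seconds."""
    start = time.perf_counter()
    importlib.import_module(module_name)
    return time.perf_counter() - start
```

In a pytest parametrization this could become `assert import_time(name) < 0.2` for each name from `iter_module_names(scopesim)`. Note that modules already in `sys.modules` return almost instantly, so a strict per-module timing would need to clear the cache first.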
```python
@pytest.fixture(scope="class")
def all_pkg():
    # TODO: This could use some kind of mock to avoid server access
    yield db.get_all_package_versions()
```
Ah, I think I wanted to do something like that at some point, but then forgot 👍
Great, I already tried running the tests (not sure now if in scopesim itself or maybe it was spextra) with …
This ensures that all files are included in the code coverage.
See #340 (comment)
The file mentioned there, https://github.com/AstarVienna/ScopeSim/blob/dev_master/scopesim/detector/nghxrg.py , is itself not broken (this test passes), but the detector package is:
Tests fail because now the coverage is down, which makes sense, because previously uncovered files are added. Turns out it was only `scopesim/detector/nghxrg.py` and `scopesim/reports/report_generator.py` though. Needed to make some minor changes to get the test to pass.