Report progress only if "--benchmark-verbose" is set #149
Conversation
Codecov Report
@@            Coverage Diff             @@
##           master     #149      +/-   ##
==========================================
- Coverage   46.26%   45.96%    -0.31%
==========================================
  Files          16       16
  Lines        1753     1758       +5
  Branches      312      312
==========================================
- Hits          811      808       -3
- Misses        867      875       +8
  Partials       75       75

Continue to review the full report at Codecov.
Force-pushed from bfeba03 to 68bbe78
@ionelmc Do you have a moment to review this PR? I'm not able to see the Travis CI logs to find the error that's blocking the checks from passing. It would be wonderful to remove those progress reports from the default verbosity; it takes forever to scroll through them to find the error messages on failing tests when doing TDD.
@@ -22,7 +26,7 @@ def __init__(self, columns, sort, histogram, name_format, logger, scale_unit):

    def display(self, tr, groups, progress_reporter=report_progress):
        tr.write_line("")
        tr.rewrite("Computing stats ...", black=True, bold=True)
I don't think this should be removed - it's just one message and it gives important feedback (it signals that heavy work is going on).
I believe the issue here is that the feedback is printed for every iteration of parametrized benchmarks. I currently have my tests running with `--benchmark-disable` because these lines separate my test results from my stats by several pages' worth of scrolling. IMO, this feedback should only be output under `--benchmark-verbose`, which is what @dimrozakis's PR is trying to achieve.
The heavy work is still visible in the final stats table even without the progress lines. Pytest's standard output is a few brief lines of setup, pass/fail/skip results, and failure summaries; moving the lengthy block of "Computing stats ... group x/y: stat (z/t)" lines behind the verbose flag would make that standard output more helpful. As it stands, the progress output is quite repetitive, especially when viewed as a static record (e.g. pytest output piped to a file).
I'm sure the line-by-line stats computation message is very useful when troubleshooting! Since I haven't yet run into an issue where it was necessary, I feel that being able to toggle this feedback on and off would be extremely helpful for users in my situation. The benchmark statistics themselves have been wonderful!
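For context, here is a minimal sketch of the reporter pair this discussion revolves around: `report_progress` rewrites a one-line status for each item, while `report_noprogress` shares the interface but stays silent. The signatures and format-string details below are assumptions for illustration, not pytest-benchmark's actual internals.

```python
# Sketch only: the real pytest-benchmark signatures may differ.

def report_progress(iterable, terminal_reporter, format_string):
    # Verbose behaviour: rewrite a one-line status for each item.
    total = len(iterable)
    for pos, item in enumerate(iterable, 1):
        line = format_string.format(pos=pos, total=total, value=item)
        terminal_reporter.rewrite(line, black=True, bold=True)
        yield line, item

def report_noprogress(iterable, terminal_reporter, format_string):
    # Same interface, but never touches the terminal.
    total = len(iterable)
    for pos, item in enumerate(iterable, 1):
        yield format_string.format(pos=pos, total=total, value=item), item
```

Because both functions are drop-in compatible, `display()` can accept either one as its `progress_reporter` argument and stay agnostic about verbosity.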
src/pytest_benchmark/table.py (outdated)

@@ -86,7 +90,7 @@ def display(self, tr, groups, progress_reporter=report_progress):
             )
             for prop in self.columns
         )
-        tr.rewrite("")
+        _progress_rewrite(progress_reporter, tr, "")
I guess I have to try it; I'm not really sure this is actually necessary.
The CI is basically passing; some runs timed out for unrelated reasons, which is why it's red.
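The wiring implied by the PR title would look something like the sketch below: pick the reporter based on the `--benchmark-verbose` flag and hand it to `display()`. The option-lookup spelling and the helper name here are assumptions for illustration, not the plugin's verified code.

```python
# Sketch: select the progress reporter from the CLI flag.
# "benchmark_verbose" as the option dest is an assumption.

def choose_progress_reporter(config):
    if config.getoption("benchmark_verbose"):  # set by --benchmark-verbose
        return report_progress                 # live, rewritten status lines
    return report_noprogress                   # silent by default
```

A caller would then invoke something like `table.display(tr, groups, progress_reporter=choose_progress_reporter(config))`, keeping the table code itself free of any verbosity checks.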
Force-pushed from 68bbe78 to 3c3f1cc
So that if `report_noprogress` is used, no rewrites are actually performed.
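That commit message describes exactly what the `_progress_rewrite` helper from the diff above needs to do. One plausible implementation (a sketch, not the PR's actual code):

```python
# Plausible sketch of _progress_rewrite; not the PR's actual code.

def _progress_rewrite(progress_reporter, tr, line, **kwargs):
    # Only touch the terminal when progress reporting is enabled, so that
    # report_noprogress really performs no rewrites at all.
    if progress_reporter is not report_noprogress:
        tr.rewrite(line, **kwargs)
```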
Force-pushed from 4b2d7fb to 527ceae
Glad to see this getting merged, thanks @ionelmc @mUtterberg