Skip logging when benchmark is disabled #199

Closed
Spectre5 opened this issue Apr 1, 2021 · 3 comments

Spectre5 commented Apr 1, 2021

I have a library where I normally disable the benchmarks and only run them when editing certain files. However, I still get the warning log messages from pytest-benchmark in this case. Can we disable logging when the benchmarks are disabled?

Another question: the last log message is always displayed on the same line as the last test for me; see the [100%]/home... below. pytest apparently doesn't emit a final newline there. Is there an easy way to avoid this so that the log message goes to the next line?

$ pytest -v --benchmark-disable
/home/.../site-packages/pytest_benchmark/logger.py:44: PytestBenchmarkWarning: Can't compare. No benchmark files in '/home/.../benchmarks'. Can't load the previous benchmark.
  warner(PytestBenchmarkWarning(text))
====================================== test session starts ======================================
platform linux -- Python 3.9.2, pytest-6.2.2, py-1.10.0, pluggy-0.13.1 -- /home/.../.venv/bin/python
cachedir: .cache/pytest
benchmark: 3.2.3 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /home/..., configfile: pyproject.toml
plugins: benchmark-3.2.3
collected 2 items                                                                               

tests/test1.py::test_method1 PASSED      ......      [ 50%]
tests/test1.py::test_method1 PASSED      ......      [100%]/home/.../site-packages/pytest_benchmark/logger.py:44: PytestBenchmarkWarning: Not saving anything, no benchmarks have been run!
  warner(PytestBenchmarkWarning(text))
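
One possible workaround (a sketch only, not something suggested in this thread): the messages are ordinary Python warnings of type PytestBenchmarkWarning, so they can in principle be silenced by message prefix via pytest's filterwarnings ini option. Note that the "Can't compare" warning is printed before the session header, so it may fire before pytest installs the configured filters; treat this as untested.

[tool.pytest.ini_options]
filterwarnings = [
    # Hide pytest-benchmark's compare/save warnings when running with --benchmark-disable.
    "ignore:Can't compare",
    "ignore:Not saving anything",
]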
ionelmc (owner) commented Apr 1, 2021

I would like to see what addopts you have in pytest.ini/tox.ini/setup.cfg.

Spectre5 commented Apr 1, 2021

Of course, I should have provided this already. Here are all of my pytest options from my pyproject.toml file.

[tool.pytest]
[tool.pytest.ini_options]
minversion = '6.2'
cache_dir = '.cache/pytest'
addopts = '''
-x
--strict-markers
--benchmark-disable
--benchmark-storage=.cache/benchmarks
--benchmark-autosave
--benchmark-compare
--benchmark-group-by=name,fullname
'''
markers = [
    "slow: marks tests as slow (deselect with '-m \"not slow\"')",
]
norecursedirs = '''
docs/_build
docs/examples
'''
doctest_optionflags = 'ELLIPSIS NORMALIZE_WHITESPACE IGNORE_EXCEPTION_DETAIL'

I also use poetry, so with these I actually call pytest with poetry run pytest -v. When I want to run the benchmarks, I run poetry run pytest -v --benchmark-enable.
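
One way to sidestep the warnings entirely (just a suggestion, not confirmed in this thread) would be to leave --benchmark-autosave and --benchmark-compare out of addopts, so a disabled run never tries to save or compare, and to pass them only together with --benchmark-enable:

$ poetry run pytest -v
$ poetry run pytest -v --benchmark-enable --benchmark-autosave --benchmark-compare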

ssbarnea commented Apr 13, 2021

I am looking for an option to disable the report completely, or at least when the test is passing, but I was not able to find any option to do that. Note that I do not want to disable the benchmarking; I just want nothing displayed if the test passes. I already added code to fail the benchmarked test if it takes too much time. Any hints? Thanks.

As an example, the coverage module has a --no-cov-on-fail option, which disables the coverage report when the test run fails, just to avoid producing noise for the user reading the console. This is mostly the same idea: avoid displaying anything unless something went outside our expectations.
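
For illustration, a rough sketch of the kind of "fail if too slow" check mentioned above. The time budget and the function are placeholders, and the attribute path benchmark.stats.stats.mean is an assumption about pytest-benchmark's fixture, not something confirmed in this thread:

def my_function():
    # Placeholder for the real code being benchmarked.
    return sum(range(1000))

def test_my_function_is_fast_enough(benchmark):
    # Run the benchmark as usual, then assert on the collected timing.
    benchmark(my_function)
    # Fail the test if the mean measured time exceeds a 10 ms budget.
    assert benchmark.stats.stats.mean < 0.01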

ionelmc added a commit that referenced this issue Apr 17, 2021
TauPan added a commit to TauPan/pytest-benchmark that referenced this issue Jun 11, 2021
TauPan added a commit to TauPan/pytest-benchmark that referenced this issue Jun 11, 2021
ionelmc pushed a commit that referenced this issue Jun 14, 2021