
Allow to configure custom columns or output reported #94

Open
srids opened this issue Nov 17, 2017 · 2 comments

srids commented Nov 17, 2017

It's pretty great that there is a --benchmark-columns option which reports the standard 'min, max, mean, stddev, median, iqr, outliers, rounds, iterations' columns, but it would be nice if it could report any column it finds during report generation.
Using the pytest_benchmark_update_json hook, I am able to add custom output, but there is no option to display it on the console. This is especially relevant when --benchmark-compare is used, where it would be quite useful to display those custom columns. In our case we record the product version, and when doing a comparison it would be nice to display which product version each benchmark run was against.
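For reference, a minimal sketch of the hook mentioned above, placed in conftest.py; the "product_version" field and its value are hypothetical. The hook adds data to the saved JSON report, but as this issue points out, the console table has no option to display it.

```python
# conftest.py -- sketch of the pytest_benchmark_update_json hook, which
# pytest-benchmark calls before writing the JSON report.

def pytest_benchmark_update_json(config, benchmarks, output_json):
    # Attach a custom top-level field to the JSON report; it travels with
    # the saved results and is available when comparing runs.
    output_json["product_version"] = "1.2.3"  # hypothetical version string
```

pytest-benchmark picks the hook up automatically when it is defined in conftest.py.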


ionelmc commented Nov 19, 2017

Duly noted. I'll think of a way to deal with this (also related to #93).

@ionelmc ionelmc added this to the v4.0.0 milestone Jan 2, 2019

aldanor commented Dec 11, 2019

Would be nice to also be able to simply enable/disable this configuration from within pytest code.

It might be sufficient to have a simple hook or something like that, so users could provide a default set of report columns from inside their code, without having to pass a command-line argument.
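Until such a hook exists, the closest thing to an in-code default (a workaround, not what is being requested here) is putting the flag in the project's pytest configuration via addopts, e.g.:

```ini
# pytest.ini -- sketch: set --benchmark-columns project-wide so the column
# set applies without passing it on every invocation.
[pytest]
addopts = --benchmark-columns=min,max,mean,stddev,rounds
```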
