It's pretty great that there is a --benchmark-columns option, which reports the standard columns (min, max, mean, stddev, median, iqr, outliers, rounds, iterations), but it would be nice if it could report any column found during report generation.
Using the pytest_benchmark_update_json hook, we are able to add custom output to the JSON report, but there is no option to display it on the console. This would be especially useful with --benchmark-compare, where those custom columns could be shown side by side. In our case each report carries a product version, and when comparing runs it would be nice to display which product version each benchmark ran against.
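For reference, this is roughly what the JSON side already looks like today. A minimal sketch of a root conftest.py using the existing pytest_benchmark_update_json hook; the `PRODUCT_VERSION` env var and the `product_version` field name are illustrative assumptions, not part of pytest-benchmark:

```python
# conftest.py -- sketch: attach a custom field to the saved JSON report.
import os


def pytest_benchmark_update_json(config, benchmarks, output_json):
    # pytest-benchmark calls this hook right before writing the JSON file,
    # so the field below ends up in saved reports and in the data that
    # --benchmark-compare reads -- but there is currently no option to
    # display it as a console column, which is what this issue asks for.
    output_json["product_version"] = os.environ.get("PRODUCT_VERSION", "unknown")
```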
It would also be nice to be able to simply enable/disable this configuration from within pytest code.
A simple hook (or something similar) might be sufficient, so users could provide a default set of report columns from inside their code, without having to pass an external command-line argument (a possible interim workaround is sketched below).
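Until such a hook exists, one workaround sketch using pytest's own startup machinery rather than anything pytest-benchmark-specific: the standard pytest_load_initial_conftests hook (honored only in the rootdir conftest.py or a plugin) can inject the flag. The column list here is just an example:

```python
# conftest.py (project root) -- sketch: provide default report columns
# from code by injecting the CLI flag at startup.
def pytest_load_initial_conftests(args):
    # Only inject a default if the user did not pass the flag explicitly,
    # so explicit command-line usage still wins.
    if not any(a.startswith("--benchmark-columns") for a in args):
        args.append("--benchmark-columns=min,max,mean,stddev,rounds")
```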