
Do not suggest usage of "--warmup" if already used #570

Closed
stephane-archer opened this issue Sep 22, 2022 · 6 comments

Comments

@stephane-archer

I ran the following command to compare the performance of these two programs

hyperfine --warmup 3 'fdupes -r EgyptPhotos/ ' 'jdupes -r EgyptPhotos/ ' | tee hyper.log

I was surprised to see the following warning: "Warning: The first benchmarking run for this command was significantly slower than the rest (2928.564 s). This could be caused by (filesystem) caches that were not filled until after the first run. You should consider using the '--warmup' option to fill those caches before the actual benchmark. Alternatively, use the '--prepare' option to clear the caches before each timing run."

Usually, I don't use tee, and I see the warmup run in the output; with tee, the output is far less verbose.
Was the warmup option ignored for some reason, or does this warning appear even when the option is set but the first run is still slower?

@sharkdp
Owner

sharkdp commented Sep 22, 2022

Not sure I get your point.

Using tee should not affect how hyperfine performs the benchmark.

Usually, I don't use tee, and I see the warmup run in the output

What does this mean?

Was the warmup option ignored for some reason, or does this warning appear even when the option is set but the first run is still slower?

Ah, hm. I haven't checked, but yes: it might be the case that the warning is shown even when --warmup is used. I agree that the suggestion is confusing in that case and should be rephrased.

I still think the warning is useful in general.

@stephane-archer
Author

Usually, I don't use tee, and I see the warmup run in the output

Usually, there is a spinner and output about what hyperfine is currently doing, so you can see it performing the warmup and then the first test, and you get information about the current average even before all runs are done. Using tee, I didn't see any of that, so I wasn't 100% sure the warmup happened even though I set the option.
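The reduced output under tee is most likely just hyperfine detecting that stdout is no longer a terminal and disabling its interactive spinner, while still performing the warmup runs. A minimal sketch of that kind of tty check in shell (an assumption about hyperfine's behavior, not its actual code; the POSIX `test -t 1` operator checks whether stdout is a terminal):

```shell
#!/bin/sh
# Show interactive progress only when stdout is a terminal,
# the way many CLI tools decide whether to draw a spinner.
if [ -t 1 ]; then
    echo "interactive: showing progress spinner"
else
    echo "non-interactive: plain output only"
fi
```

When the script is piped through tee (or any other command), stdout becomes a pipe, so the non-interactive branch runs, but nothing about the work being done changes.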

I agree with you that knowing the first run was slower, even when you use --warmup, is important. It's just that, because the option is already set, you get confused: "is it properly set up?" A spelling mistake or wrong argument order can matter and be silently ignored.

@sharkdp
Owner

sharkdp commented Oct 13, 2022

I cannot confirm this.

hyperfine --warmup 3 --runs 1 'sleep 1' | tee

clearly takes around four seconds (three warmup runs plus one timed run of sleep 1), so the warmup is being performed.

@sharkdp
Owner

sharkdp commented Oct 13, 2022

Maybe try a longer warmup phase?

@stephane-archer
Author

The title of the issue is actually incorrect now, because I understand the problem better. Just read the conversation again to refresh your memory.
The warning says: "You should consider using the '--warmup' option to fill those caches before the actual benchmark. Alternatively, use the '--prepare' option to clear the caches before each timing run."
But the --warmup option was already set, so the warning is confusing. Still, knowing that the first run was slower is important, so emitting the warning without the mention of --warmup could be a good solution.
Is it clear what I mean?
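Rephrasing the warning as suggested could be as simple as gating the --warmup suggestion on whether warmup runs were requested. A minimal sketch in shell (hypothetical: hyperfine itself is written in Rust, and the variable names here are invented):

```shell
#!/bin/sh
# Hypothetical sketch of the reworded warning logic; not hyperfine's code.
warmup_runs=3       # value of --warmup as parsed from the command line
first_run_slow=true # set when the first timing run is a significant outlier

if [ "$first_run_slow" = true ]; then
    echo "Warning: the first benchmarking run was significantly slower than the rest."
    # Only suggest --warmup when the user did not already pass it.
    if [ "$warmup_runs" -eq 0 ]; then
        echo "Consider using '--warmup' to fill caches before the benchmark,"
        echo "or '--prepare' to clear them before each timing run."
    fi
fi
```

With warmup_runs set to 3, only the outlier warning is printed; the confusing --warmup suggestion appears only when no warmup was requested.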

@sharkdp sharkdp changed the title Does the warmup option ignore when using tee? Do not suggest usage of "--warmup" if already used Nov 20, 2022
@stephane-archer
Author

Thank you, you are the best
