Beat xxd #66
Comments
Thank you for the feedback. I agree, it would be nice. But not more. I don't really see a problem with the current speed, as I don't think that performance (at the current level) is critical for a hexdump tool. In which real-world use case would we really need it to be faster?
I tried to find a "real-world use case" but failed. I would say it is an ideological thing... something along the lines of "software shouldn't get slower with time, but faster". I'll take a look at the source code in my free time; maybe (though unlikely) I'll find a way to improve it :)
I would agree. But
Real-world use case -- I've got a pretty big file that's mostly zeroes, with a k or so of nonzero data. Reading from /tmp,
@remexre Thank you. If someone wants to work on this, here is a reproducible benchmark (I'm using `hyperfine`):

```bash
#!/bin/bash

# 10 MiB of zeroes followed by 1 KiB of random data
dd if=/dev/zero bs=10M count=1 > data
dd if=/dev/urandom bs=1k count=1 >> data

hyperfine --warmup 3 \
    'hexyl data' \
    'hexyl --no-squeezing data' \
    'hexdump -C data' \
    'hexdump -v -C data' \
    'xxd data' \
    --export-markdown results.md
```

(Note: `hexdump` has no `--no-squeezing` option; its `-v` flag is what disables squeezing.)
Apparently,
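For context: "squeezing" collapses runs of identical output lines into a single `*`, which is why a file that is mostly zeroes dumps so quickly when squeezing is enabled. A quick way to see the effect (a sketch assuming a system with `dd` and a BSD/util-linux-style `hexdump`):

```shell
# Make a 64 KiB file of zeroes.
dd if=/dev/zero bs=1k count=64 2>/dev/null > zeros

# With squeezing (the default), repeated lines collapse to a single '*',
# so the whole dump is only a few lines long.
hexdump -C zeros | wc -l

# With -v (squeezing disabled), every 16-byte line is printed,
# so the dump is thousands of lines long.
hexdump -v -C zeros | wc -l
```

The same asymmetry explains the benchmark above: the squeezed run touches the terminal only a handful of times, while the unsqueezed run has to format and write every line.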
See #73.
Old commit, but here's another use case for posterity. I want to compare two disk images, and I want to not only see where the data differs, but also what the differing data is, in a hexdump format. To do that, I like using tools like this to produce a plaintext version of the data that can then be diffed.
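That workflow can be sketched with any hexdump-style tool; here with `hexdump -C` (the image names and contents are hypothetical, and `hexyl` would slot in the same way if its output is preferred):

```shell
# Two hypothetical "disk images" that differ by one byte in the middle.
printf 'AAAABBBBCCCC' > disk-a.img
printf 'AAAABxBBCCCC' > disk-b.img

# Dump both to plain text, then diff the dumps: the differing hex
# lines show both where the images diverge and what the bytes are.
hexdump -C disk-a.img > a.hex
hexdump -C disk-b.img > b.hex
diff a.hex b.hex || true   # diff exits non-zero when the dumps differ
```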
I did a bit of benchmarking and I can't help but notice that `xxd` is faster than `hexyl`. On my machine on a file of about 700M:

It would be nice to beat `xxd` in speed... I've got no idea how to do it though.