We encountered junk data provided by retdec-fileinfo in the PE rich header signature, e.g. for 2acd2ff9c70ba9398221cf2265b2fddaceae3e31a29883594bcce545f02be6a3:
Investigate the reasons and try to prevent providing such junk data. The solution would be either to fix the bug causing this (if there is one) or to reliably detect such cases and prevent them. It is, however, quite possible that the file offsets do in fact point at existing "junk" data; try to come up with some solution anyway, e.g. heuristics or sanity checks. Analyze and discuss with @PeterMatula.
I had a look at the sample and I think the Rich header analysis is wrong. The correct analysis would be: find the "Rich" marker that terminates the Rich header, grab the XOR key that follows it, and then decrypt from the bottom up until you reach the decrypted "DanS" marker that starts the header. Currently, the analysis finds "Rich" and the XOR key, but then decrypts everything between the DOS header/stub and the terminating "Rich" marker.
I've tried a quick implementation of this and it seems to give sane results without the junk; the current analysis returns junk data even when it does not lie between the "DanS" and "Rich" markers. The code is a bit more complex and uncommented, so I'll take a deeper dive to confirm my suspicion.
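For reference, here is a minimal sketch of that bottom-up approach (not the actual RetDec code; the `RichEntry` type and `parseRichHeader` name are made up for illustration). It assumes `data` holds the raw bytes between the end of the DOS header/stub and the PE signature, and that the Rich header is DWORD-aligned:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <optional>
#include <vector>

// One decoded Rich header entry: @comp.id (product id + build id) and use count.
struct RichEntry
{
    uint16_t buildId;
    uint16_t productId;
    uint32_t count;
};

// Parse the Rich header bottom-up. Returns std::nullopt when no valid
// "Rich"/"DanS" pair is found, instead of reporting junk.
std::optional<std::vector<RichEntry>> parseRichHeader(const std::vector<uint8_t>& data)
{
    constexpr uint32_t RICH = 0x68636952; // "Rich" as a little-endian DWORD
    constexpr uint32_t DANS = 0x536E6144; // "DanS" as a little-endian DWORD

    auto readDword = [&](std::size_t offset) {
        uint32_t value;
        std::memcpy(&value, data.data() + offset, sizeof(value));
        return value;
    };

    if (data.size() < 8)
        return std::nullopt;

    // 1. Find the terminating "Rich" marker (assumed DWORD-aligned); the XOR key
    //    is stored in the DWORD right after it.
    std::optional<std::size_t> richOffset;
    for (std::size_t off = 0; off + 8 <= data.size(); off += 4)
        if (readDword(off) == RICH)
            richOffset = off;
    if (!richOffset)
        return std::nullopt;
    const uint32_t key = readDword(*richOffset + 4);

    // 2. Decrypt upwards from "Rich" until the decrypted "DanS" start marker shows up.
    std::optional<std::size_t> dansOffset;
    for (std::size_t off = *richOffset; off >= 4 && !dansOffset; )
    {
        off -= 4;
        if ((readDword(off) ^ key) == DANS)
            dansOffset = off;
    }
    if (!dansOffset)
        return std::nullopt; // No start marker above "Rich" -> report nothing.

    // 3. "DanS" is followed by three padding DWORDs that decrypt to zero; the
    //    (comp.id, count) pairs sit between that padding and the "Rich" marker.
    std::vector<RichEntry> entries;
    for (std::size_t off = *dansOffset + 16; off + 8 <= *richOffset; off += 8)
    {
        const uint32_t compId = readDword(off) ^ key;
        const uint32_t count  = readDword(off + 4) ^ key;
        entries.push_back({static_cast<uint16_t>(compId & 0xFFFF),
                           static_cast<uint16_t>(compId >> 16),
                           count});
    }
    return entries;
}
```

The key difference from the current behaviour is that decryption is bounded by the decrypted "DanS" marker, so bytes that merely sit between the DOS stub and "Rich" without belonging to the Rich header are never reported.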