We encountered junk data produced by retdec-fileinfo in the PE resource table, specifically in the resource's type value. E.g. for 097a0f8b8c3f2b90f5360f27da5ef8e5a7406c6f211c8d3e56a1671933508bc1:
```
i  nameId  type  typeId  language  lanId  slanId  offset  size  crc32
-----------------------------------------------------------------------------------------------------------
0  105     \x03\x17\xb2\x04\xd7\x17\xddfF&\xf27u\x16P\xfb\x0d\x9d\xc8\x0a\xbf\xec\x06\xbb\xb9\xae\x97\x03p...
```
Investigate the causes and try to prevent emitting such junk data. The solution is either to fix the bug causing this (if there is a bug) or to reliably detect such cases and filter them out. It is, however, quite possible that the file offsets do in fact point at existing "junk" data; try to come up with some solution anyway - e.g. heuristics or sanity checks - and analyze and discuss with @PeterMatula.
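One possible sanity check along those lines is to reject a resource type/name string when it contains control characters or is implausibly long. The sketch below is only an illustration of the idea; the helper name and the limits are assumptions, not existing retdec code.

```cpp
#include <cstddef>
#include <string>

// Hypothetical helper (not part of retdec): treat a resource type/name string
// as junk if it is empty, implausibly long, or contains control characters.
// The cap of 100 characters is an illustrative assumption for discussion.
bool looksLikeValidResourceName(const std::u16string &name)
{
    constexpr std::size_t kMaxLength = 100;
    if (name.empty() || name.size() > kMaxLength)
        return false;

    for (char16_t c : name)
    {
        // Control characters are a strong hint that the offset points at junk.
        // Legitimate names may contain non-ASCII code points, so only the
        // control range is rejected here.
        if (c < 0x20 || c == 0x7F)
            return false;
    }
    return true;
}
```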
I looked into this; it seems the directory type name points to junk data. I am not sure whether there is a valid reason for it. A simple sanity check on the string length could be a viable solution - for example, allow only strings that are at most 100 characters long. I wasn't sure about the ideal cap, as I am not that familiar with PE resources, but this is the same sanity check LIEF applies, and it uses a 100-character cap: https://github.com/lief-project/LIEF/blob/d8bca167f81eb588c82c8a9e51d0bf267cd627e3/src/PE/Parser.cpp#L414
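For discussion, a minimal sketch of such a length cap follows. The constant and function names are made up for illustration; the point is simply to validate the declared length before reading the UTF-16 name string, similar to what LIEF does with its 100-character limit.

```cpp
#include <cstddef>
#include <cstdint>

// Illustrative cap, mirroring the 100-character limit LIEF uses.
constexpr std::size_t MAX_RESOURCE_NAME_LENGTH = 100;

// The resource directory stores a name as a 16-bit length followed by the
// UTF-16 string. Checking the declared length before reading the string
// avoids copying arbitrary junk into the reported type value.
bool isPlausibleResourceNameLength(std::uint16_t declaredLength)
{
    return declaredLength > 0 && declaredLength <= MAX_RESOURCE_NAME_LENGTH;
}
```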