Jeff King <p...@peff.net> writes:
> That being said, the parse_sha1_header() function clearly does not
> detect overflow at all when parsing the size. So on a 32-bit system, you
> end up with:
> $ git fsck
> fatal: Out of memory, malloc failed (tried to allocate 4294967141 bytes)
> which is not correct, but I'm not sure it's a security problem. Integer
> overflows are an issue if they cause us to under-allocate, and then to
> write more bytes than we allocated. In this case, I would expect
> unpack_sha1_rest() to never write more bytes than the "size" we parsed
> and allocated (and to complain if the number of bytes we get from the
> zlib sequence do not exactly match the claimed size).
> So a more interesting example is more like "ULONG_MAX + 5", where we
> would overflow to 5 bytes. And we'd hope that unpack_sha1_rest does not
> ever write more than 5 bytes. From my reading and a few tests with gdb,
> it does not. However, it also does not notice that there were more bytes
> that we didn't use.
> So I think there's room for improved diagnosis of bogus situations
> (including integer overflows), but I don't see any actual security bugs.
I agree with the overall conclusion. This does look like an attempt
to throw random fuzz at Git and see if and how it breaks, and in this
particular case Git is simply doing the right thing (the fault lies
in how ASan was used and how its output was interpreted).
Throwing random fuzz to see what breaks is not a bad thing to do
per se, but anybody who does so without wearing a black hat needs to
keep two things in mind:
* When a random fuzz attempt does uncover a security issue,
reporting it here on this list is a grossly irresponsible way to
disclose the issue. We have the git-security list for that.
* A random fuzz may stop Git, and that may be a perfectly
legitimate thing to happen; e.g. the data may ask for a large but
still valid amount of memory to be allocated that happens not to
fit on the hardware the fuzz attempt is being run on, and
xmalloc() may detect the situation and die, like the above
example. False positives are expected, and you want to make sure
you cull them before making your reports. Otherwise they will
unnecessarily burden the people doing the real work, i.e.
reproducing and correcting problems that may be security related
and were irresponsibly disclosed here, quickly enough to minimize
damage.