This is a fixed version of the initial patch, plus a two-patch
implementation of a recursion-free unpack_entry.  (I narrowly resisted
using "unrecursify" to describe it.)

I wrote:

> Junio C Hamano <> writes:
>> The stack/recursion is used _only_ for error recovery, no?  If we do
>> not care about retrying with a different copy of an object we find
>> in the delta chain, we can just update obj_offset with base_offset
>> and keep digging.  It almost makes me wonder if a logical follow-up
>> to this patch may be to do so, and rewrite the error recovery
>> codepath to just mark the bad copy and jump back to the very top,
>> retrying everything from scratch.
> I totally agree.  I'll try this again -- my last attempt just didn't
> work out...

Now I remember why it wasn't possible: we would have to go through the
blacklists at every iteration, whereas now we only check them in the
initial lookups.

I'm not sure it makes that much sense.  If we need to shave off some
speed in these functions, I would rather:

- write packed_object_info so that it runs with constant space, and
  restarts another implementation _with_ the recovery stack if it hits
  a problem;

- write unpack_entry so that it reuses the stack.  It can't do so by
  using a static buffer because it needs to be reentrant in the error
  case.

Thomas Rast (3):
  sha1_file: remove recursion in packed_object_info
  Refactor parts of in_delta_base_cache/cache_or_unpack_entry
  sha1_file: remove recursion in unpack_entry

 sha1_file.c | 411 +++++++++++++++++++++++++++++++++++++++---------------------
 1 file changed, 266 insertions(+), 145 deletions(-)

