On Tue, Mar 01, 2016 at 05:40:28PM -0800, Stefan Beller wrote:

> So throwing away half finished stuff while keeping the front load?

Throw away the truncated object and any objects whose delta chains don't
resolve entirely within the transferred part.
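That salvage rule could be sketched roughly like this (illustrative only,
not git's actual code; the data structures are assumptions):

```python
# Salvaging a truncated pack: keep an object only if its data arrived
# intact AND every object along its delta chain also arrived intact.

def usable_objects(received, deltas):
    """received: set of sha1s whose data arrived in full.
    deltas: dict mapping a delta object's sha1 to its base's sha1
    (objects stored whole don't appear as keys)."""
    usable = set()
    for sha in received:
        cur = sha
        ok = True
        while cur in deltas:            # walk toward the chain's base
            cur = deltas[cur]
            if cur not in received:     # base was lost in the truncated tail
                ok = False
                break
        if ok:
            usable.add(sha)
    return usable
```

Everything `usable_objects()` returns can be indexed and kept; the rest is
re-requested on the next attempt.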
 
> > indexing the objects it
> > contains, and then re-running clone and not having to fetch those
> > objects.
> 
> The pack is not deterministic for a given repository. When creating
> the pack, you may encounter races between threads, such that the order
> in a pack differs.

FWIW, I wasn't proposing to recreate the remaining bits of that _pack_;
just do the normal pull with one addition: start by sending the list
of sha1s of the objects you are about to send and let the recipient reply
with "I already have <set of sha1>, don't bother with those", then exclude
those from the transfer.  The encoding of the "already have" set is an
interesting variable here - it might be a plain list of sha1s, might be its
complement ("I want the following subset"), might be "145th to 1029th,
1517th and 1890th to 1920th of the list you've sent"; which form ends
up more efficient needs to be found experimentally...
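A rough sketch of that experiment (names and byte costs are assumptions,
not anything from the git protocol): collapse the indices of the objects
the recipient already has into runs, then compare the size of each encoding:

```python
# Compare encodings of the "I already have these" reply by byte cost:
# a plain sha1 list, its complement ("I want these"), or index ranges
# into the sender's announced list.

def encode_ranges(indices):
    """Collapse sorted indices into inclusive (start, end) runs,
    e.g. [144, 145, 146, 1516] -> [(144, 146), (1516, 1516)]."""
    runs = []
    for i in indices:
        if runs and i == runs[-1][1] + 1:
            runs[-1] = (runs[-1][0], i)   # extend the current run
        else:
            runs.append((i, i))           # start a new run
    return runs

def cheapest_reply(total, have_indices):
    """total: number of sha1s the sender announced.
    have_indices: sorted indices of those the recipient already has.
    Assumed costs: 20 bytes per sha1, 4 bytes per run endpoint."""
    plain = 20 * len(have_indices)               # sha1s we have
    complement = 20 * (total - len(have_indices))  # sha1s we still want
    ranges = 8 * len(encode_ranges(have_indices))  # (start, end) pairs
    return min(("plain", plain), ("complement", complement),
               ("ranges", ranges), key=lambda kv: kv[1])
```

For a truncated clone the received objects form a contiguous prefix, so
the range encoding collapses to a single pair and wins easily; scattered
"have" sets are where the answer would actually need measuring.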

IIRC, the objection had been that the organisation of the pack will lead
to many cases where deltas are transferred *first*, with the base object not
getting there prior to disconnect.  I suspect that the fraction of the objects
getting through would still be worth it, but I hadn't experimented enough
to be able to tell...

I was more interested in resumable _pull_, with restarted clone treated as
special case of that.