Like many dev shops, we run a CI server that basically does:

  git fetch origin $some_branch &&
  git checkout FETCH_HEAD &&
  make test

all day long. Sometimes the fetches would get very slow, and the problem
turned out to be a combination of:

  1. Never running "git gc". This means you can end up with a ton of
     loose objects, or even a bunch of small packs[1].

  2. One of the loops in fetch caused us to re-scan the entire
     objects/pack directory a number of times proportional to the
     number of refs on the remote (the sketch after this list shows
     a quick way to spot both symptoms).
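
For reference, here's one way to check for both symptoms (a rough
sketch; the strace bit is just one way to eyeball the re-scans,
nothing this series depends on):

  # "count" is the number of loose objects, "packs" the number of packs
  git count-objects -v

  # watch fetch re-open the pack directory (Linux-only)
  strace -f -e trace=open,openat git fetch origin 2>&1 |
    grep -c objects/pack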

I think the fundamental fix is to gc more often, as it makes the
re-scans cheaper and speeds up object lookup in general. But the
repeated re-scans strike me as hacky in their own right. This series
tries to address both:

  [1/2]: fetch: run gc --auto after fetching
  [2/2]: fetch-pack: avoid repeatedly re-scanning pack directory
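
Until 1/2 lands, a CI loop like the one above can run the auto-gc
check itself; a minimal sketch, assuming the stock thresholds
(gc.auto=6700 loose objects, gc.autoPackLimit=50 packs):

  # no-op unless a threshold is crossed, so it's cheap to run on
  # every iteration
  git gc --auto

  # or make auto-gc kick in sooner than the stock thresholds
  git config gc.auto 2000
  git config gc.autoPackLimit 10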

-Peff

[1] It turns out we had our transfer.unpacklimit set unreasonably low,
    leading to a large number of tiny packs, but even with the defaults,
    you will end up with a ton of loose objects if you do repeated small
    fetches.
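
    An unset transfer.unpacklimit falls back to fetch.unpacklimit,
    which defaults to 100; this snippet (just illustrative) shows
    which case you're in:

      # prints nothing and exits non-zero if the key is unset;
      # fetches with fewer objects than the limit are exploded into
      # loose objects, bigger ones are kept as a single pack
      git config transfer.unpacklimit || echo unset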