The multi-pack-index provides a fast way to find an object among a large
list of pack-files. It stores a single pack-reference for each object id, so
duplicate objects are ignored. Among a list of pack-files storing the same
object, the most-recently modified one is used.

Create new verbs for the multi-pack-index builtin.
* 'git multi-pack-index expire': If we have a pack-file indexed by the
multi-pack-index, but all objects in that pack are duplicated in
more-recently modified packs, then delete that pack (and any others like
it). Delete the reference to that pack in the multi-pack-index.
* 'git multi-pack-index repack --batch-size=<size>': Starting from the
oldest pack-files covered by the multi-pack-index, select packs whose
on-disk size is below the batch size until the selected packs' combined
size reaches the batch size. Create a new pack containing all objects
that the multi-pack-index references in those packs.

This allows us to create a new pattern for repacking objects: run 'repack'.
After enough time has passed that all Git commands that started before the
last 'repack' are finished, run 'expire'. This approach has some
advantages over the existing "repack everything" model:
1. Incremental. We can repack a small batch of objects at a time, instead
of repacking all reachable objects. We can also limit ourselves to the
objects that do not appear in newer pack-files.
2. Highly Available. By adding a new pack-file (and not deleting the old
pack-files) we do not interrupt concurrent Git commands, and do not
suffer performance degradation. By expiring only pack-files that have no
referenced objects, we know that Git commands that are doing normal
object lookups* will not be interrupted.
*Note: if someone concurrently runs a Git command that uses
get_all_packs(), then that command could try to read the pack-files and
pack-indexes that we are deleting during an expire command. Such
commands are usually related to object maintenance (e.g. fsck, gc,
pack-objects) or are related to less-often-used features (e.g.
fast-import, http-backend, server-info).
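As a rough sketch, the repack-then-expire cycle described above could look
like the shell session below. The throwaway repository, the seed commit,
and the 2g batch size are illustrative assumptions, not part of the
series; 'write' is the existing multi-pack-index verb.

```shell
# Illustrative sketch of the repack-then-expire maintenance cycle.
# The temporary repo, commit contents, and batch size are assumptions.
set -e

repo=$(mktemp -d)
git init --quiet "$repo"
cd "$repo"

# Create some objects and pack them so the multi-pack-index has a pack
# to cover.
echo content >file
git add file
git -c user.name=example -c user.email=example@example.com \
    commit --quiet -m 'seed object'
git repack -d --quiet

# Write the multi-pack-index over the current packs (existing verb).
git multi-pack-index write

# Roll small packs into a new pack, up to the batch size.
git multi-pack-index repack --batch-size=2g

# After enough time has passed that no concurrent Git process can still
# hold the old packs open, drop packs whose objects are all duplicated
# in newer packs.
git multi-pack-index expire
```

Because 'expire' only deletes packs with no referenced objects, the steps
above can run while other Git processes perform normal object lookups.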
We plan to use this approach in VFS for Git to do background maintenance of
the "shared object cache" which is a Git alternate directory filled with
packfiles containing commits and trees. We currently download pack-files on
an hourly basis to keep up-to-date with the central server. The cache
servers supply packs on an hourly and daily basis, so most of the hourly
packs become useless after a new daily pack is downloaded. The 'expire'
command would clear out most of those packs, but many packs with fewer
than 100 objects would remain. The 'repack' command (probably with a
batch size of 1-3gb) can condense those leftover packs in commands that run for
1-3 min at a time. Since the daily packs range from 100-250mb, we will also
combine and condense those packs.

Thanks,
-Stolee
Derrick Stolee (5):
multi-pack-index: prepare for 'expire' verb
midx: refactor permutation logic
multi-pack-index: implement 'expire' verb
multi-pack-index: prepare 'repack' verb
midx: implement midx_repack()

Documentation/git-multi-pack-index.txt | 20 +++
builtin/multi-pack-index.c | 12 +-
midx.c | 222 +++++++++++++++++++++++--
midx.h | 2 +
t/t5319-multi-pack-index.sh | 98 +++++++++++
5 files changed, 343 insertions(+), 11 deletions(-)

base-commit: 26aa9fc81d4c7f6c3b456a29da0b7ec72e5c6595
Published-As:
https://github.com/gitgitgadget/git/releases/tags/pr-92%2Fderrickstolee%2Fmidx-expire%2Fupstream-v1
Fetch-It-Via: git fetch https://github.com/gitgitgadget/git
pr-92/derrickstolee/midx-expire/upstream-v1
Pull-Request: https://github.com/gitgitgadget/git/pull/92
--
gitgitgadget