On Sun, Sep 23, 2018 at 03:04:58PM +0200, Lars Schneider wrote:
> Hi,
> 
> I recently had to purge files from large Git repos (many files, many commits).
> The usual recommendation is to use `git filter-branch --index-filter` to purge
> files. However, this is *very* slow for large repos (e.g. it takes 45min to
> remove the `builtin` directory from git core). I realized that I can remove
> files *way* faster by exporting the repo, removing the file references,
> and then importing the repo (see Perl script below, it takes ~30sec to remove
> the `builtin` directory from git core). Do you see any problem with this
> approach?

I don't know of any problems with this approach.  I didn't audit your
specific Perl script for any issues, though.

I suspect you're gaining speed mostly because you're running three processes
total, instead of the at least one process (sh) that filter-branch spawns per
commit.  So I don't think there's anything Git can do on our end to make this
faster without a redesign.
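
For concreteness, the whole approach boils down to a three-process pipeline
along these lines (I haven't looked at your script; the filter below is only
a rough sketch that drops everything under builtin/ from the fast-export
stream, and the exact fast-export flags, script name, and target path are
made up):

    git init ../git-purged
    git fast-export --signed-tags=strip --all |
        perl purge-builtin.pl |
        git -C ../git-purged fast-import

where purge-builtin.pl could look roughly like this:

    #!/usr/bin/perl
    # Rough sketch (illustrative, not the script quoted above): filter a
    # `git fast-export` stream, dropping every filemodify/filedelete
    # command that touches builtin/.
    use strict;
    use warnings;

    binmode STDIN;
    binmode STDOUT;

    while (my $line = <STDIN>) {
        if ($line =~ /^data (\d+)$/) {
            # Pass "data" payloads (blob contents, commit messages)
            # through verbatim so their bytes can't be mistaken for
            # filechange commands.
            print $line;
            my $want = $1;
            my $buf  = '';
            read(STDIN, $buf, $want) == $want or die "truncated data block";
            print $buf;
            next;
        }
        # Skip "M <mode> <dataref> <path>" and "D <path>" lines under
        # builtin/.
        next if $line =~ m{^(?:M [0-7]+ \S+ |D )builtin/};
        print $line;
    }

I believe the blobs for the purged files still travel through the stream;
they just end up unreferenced on the import side and can be pruned later.
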
-- 
brian m. carlson: Houston, Texas, US
OpenPGP: https://keybase.io/bk2204
