On 07.06.2013 22:21, Constantine A. Murenin wrote:
> I'm totally fine with daily updates; but I think there still has to be
> some better way of doing this than wasting 0.5s of CPU time and 5s of
> HDD time (if completely cold) for each blame / log, at the price of
> more storage and some pre-caching, and (daily (in my use-case))
> fine-grained incremental updates.

To get a feel for the numbers: I would guess 'git blame' is mostly run against the newest version and the release version of a file, right? I couldn't find the number of files in BSD, so let's take Linux instead: that is 25k files for version 2.6.27. Let's say 35k files altogether for both the release and newer versions of the files.

A typical page of git blame output on GitHub seems to be in the vicinity of 500 kbytes, though that appears to include lots of overhead for comfort functions. At the very least it makes a good upper-bound value.

35k files times 500 kbytes gives 17.5 Gbytes, a trivial value for a static *disk*-based cache. It is also a manageable value for affordable SSDs.
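As a quick sanity check, the arithmetic above in a few lines of Python (the figures are the estimates from the text, not measured values):

```python
# Back-of-the-envelope estimate of the blame-cache size.
total_files = 35_000        # release + newer versions of each file (estimate)
page_bytes = 500_000        # ~500 kB per cached blame page (upper bound)

cache_bytes = total_files * page_bytes
print(f"{cache_bytes / 1e9:.1f} GB")  # prints "17.5 GB"
```

Even doubling the per-page figure only brings it to 35 GB, still well within cheap-disk territory.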
