[ANNOUNCE] git-sign, simple scripts to generate and verify securely signed Git checkouts
Hi,

Mostly as a proof of concept, I've created two scripts to sign and verify Git checkouts. I'm saying "checkouts" since they work on the working directory contents, not the tree referred to by the signed commit (both for simplicity and, probably, trust). Like some other such solutions, this adds secure hashes to the signed tag message.

There are two drawbacks and one advantage versus other solutions:

- meant for small repositories only (each file in the repository takes up a line in the tag message)
- relatively hacky: e.g. newlines in file names may be problematic, it doesn't currently use gpg's --status-fd or --with-colons, and it doesn't check git config
+ easily verifiable scripts; checking can even be done manually (hence no need for casual users to (blindly) trust third-party code)

https://github.com/pflanze/git-sign

Christian.
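The manual-verification idea can be sketched as follows. This is my own illustration, not git-sign's actual format: I assume one "hash  path" line per file in the tag message, use sha256sum, and use an unsigned annotated tag for the demo (git-sign would use a GPG-signed tag, verified with git tag -v).

```shell
#!/bin/sh
# Sketch only: build a hash manifest of the working directory, store it in
# a tag message, then recompute and compare. Tag name, hash choice, and
# manifest layout are assumptions, not git-sign's documented format.
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q repo
cd repo
git config user.email you@example.com
git config user.name You
echo hello > a.txt
echo world > b.txt
git add . && git commit -qm init
# One line per tracked file: "<sha256>  <path>", sorted for stable compare
git ls-files -z | xargs -0 sha256sum | sort > manifest
git tag -a -F manifest signed-demo
# Manual check: extract the tag message (everything after the header) and
# diff it against freshly computed hashes of the working directory
git cat-file tag signed-demo | sed '1,/^$/d' > tagmsg
git ls-files -z | xargs -0 sha256sum | sort > actual
match=fail
if diff -q tagmsg actual >/dev/null; then match=ok; fi
echo "manifest check: $match"
```

With a real signed tag, `git tag -v signed-demo` would additionally verify the GPG signature over that manifest.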
Re: git gc --aggressive led to about 40 times slower git log --raw
2014-02-20 23:35 GMT+00:00 Duy Nguyen pclo...@gmail.com:
> does it make sense to apply --depth=250 for old commits only

Just wondering: would it be difficult to fix the problems that lead to worse-than-linear slowdown with --depth (i.e. an adaptive cache/hash table size)? If the performance difference between, say, --depth=25 and --depth=250 could be reduced from a factor of 40 to 10 (or better, to the point where other things again take more time than object access), that would seem like a nice gain in any case.

Also, man git-gc should document that --aggressive leads to slower *read* performance after the gc. I remember having read that option's docs when I ran it, and since they didn't mention that it makes reads slower, I didn't expect it to, and thus didn't remember it as the source of the problem when I noticed that things were slow. (But I took from the discussion that increasing the gzip window size (?) would make things smaller anyway, so perhaps all that isn't even necessary?)

I can test next week if you have particular suggestions to test.

Christian.
--
To unsubscribe from this list: send the line "unsubscribe git" in the body of a message to majord...@vger.kernel.org
More majordomo info at http://vger.kernel.org/majordomo-info.html
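The middle ground hinted at above (smaller packs without deep chains) can be sketched like this; the specific flag values are my assumption, not something settled in the thread. The idea is to repack with --aggressive's larger delta search window while keeping the default chain depth of 50:

```shell
#!/bin/sh
# Demo on a synthetic repo; on a real repo only the repack line matters.
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q
git config user.email you@example.com
git config user.name You
# a toy history with delta-friendly blobs (each revision extends the last)
for i in 1 2 3 4 5; do
  seq 1 $((i * 400)) > data.txt
  git add data.txt
  git commit -qm "rev $i"
done
# larger search window (better compression), default depth (fast reads)
git repack -a -d -f -q --window=250 --depth=50
packs=$(ls .git/objects/pack/*.pack | wc -l)
echo "packs after repack: $packs"
# on a real repo one would then compare:
#   du -s .git/objects/pack
#   time git log --raw >/dev/null
```

Whether --window=250 alone recovers most of --aggressive's size win is exactly the open question in the thread.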
Re: git gc --aggressive led to about 40 times slower git log --raw
2014-02-19 10:14 GMT+00:00 Duy Nguyen pclo...@gmail.com:
> Christian, if you want to experiment this, update MAX_DELTA_CACHE in sha1_file.c and rebuild.

I don't have the time right now. (Perhaps next week?)
Re: git gc --aggressive led to about 40 times slower git log --raw
2014-02-18 9:45 GMT+00:00 Duy Nguyen pclo...@gmail.com:
> Christian can try git repack -adf

That's what I already mentioned in my first mail is what I used to fix the problem. Here are some 'hard' numbers, FWIW:

- both ~/scr and swap are on the same SSD;

$ free
             total       used       free     shared    buffers     cached
Mem:       3996748    3800828     195920          0     377176    1078848
-/+ buffers/cache:    2344804    1651944
Swap:      2097148     169760    1927388

git only used up to about 100 MB of VIRT or RSS when I checked; there was an ulimit of -S -v 120.

- this is git version 1.7.10.4 (1:1.7.10.4-1+wheezy1 i386 Debian)

- after my attempted merge (which had conflicts and which I then cancelled by way of git reset --hard), and then a git gc, the times were:

~/scr$ time git log --raw > _THELOG
real 3m7.002s
user 2m0.252s
sys 1m6.008s

- on a copy:

/dev/shm/scr$ time git repack -a -d -f
Counting objects: 34917, done.
Delta compression using up to 2 threads.
Compressing objects: 100% (27038/27038), done.
Writing objects: 100% (34917/34917), done.
Total 34917 (delta 13928), reused 0 (delta 0)
real 4m33.193s
user 3m42.950s
sys 1m13.821s

/dev/shm/scr$ time git log --raw > _THELOG2
real 0m8.276s
user 0m7.192s
sys 0m1.052s

(not sure why it took 8s here; perhaps I had another process running at the same time? Compare with the 0m4.913s below.)

/dev/shm/scr$ time git gc --aggressive
Counting objects: 36066, done.
Delta compression using up to 2 threads.
Compressing objects: 100% (27812/27812), done.
Writing objects: 100% (36066/36066), done.
Total 36066 (delta 14367), reused 21699 (delta 0)
Checking connectivity: 36066, done.
real 5m52.013s
user 8m28.652s
sys 1m4.308s

/dev/shm/scr$ time git log --raw > _THELOG2
real 1m34.430s
user 0m47.291s
sys 0m46.615s

/dev/shm/scr$ time git repack -adf
Counting objects: 36066, done.
Delta compression using up to 2 threads.
Compressing objects: 100% (27812/27812), done.
Writing objects: 100% (36066/36066), done.
Total 36066 (delta 14256), reused 21699 (delta 0)
real 2m32.083s
user 1m51.295s
sys 1m4.940s

/dev/shm/scr$ time git log --raw > _THELOG3
real 0m4.913s
user 0m3.944s
sys 0m0.944s

/dev/shm/scr$ du -s .git
43728 .git

- back in the original place:

~/scr$ time git repack -a -d -f
Counting objects: 36066, done.
Delta compression using up to 2 threads.
Compressing objects: 100% (27812/27812), done.
Writing objects: 100% (36066/36066), done.
Total 36066 (delta 14257), reused 21700 (delta 0)
real 4m6.503s
user 3m16.568s
sys 1m11.640s

~/scr$ time git log --raw > _THELOG2
real 0m5.002s
user 0m4.032s
sys 0m0.952s
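The difference the transcript shows (fast after repack -adf, slow after gc --aggressive) comes down to delta-chain depth, which can be inspected directly. This diagnostic is my addition, not something used in the thread: `git verify-pack -v` lists each object's chain depth and ends with a chain-length histogram.

```shell
#!/bin/sh
# Demo on a synthetic repo with a few revisions of a growing file, so the
# pack contains some delta chains to report on.
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q
git config user.email you@example.com
git config user.name You
for i in 1 2 3 4; do
  seq 1 $((i * 500)) > f.txt
  git add f.txt
  git commit -qm "rev $i"
done
git gc -q --aggressive
idx=$(ls .git/objects/pack/*.idx)
out=$(git verify-pack -v "$idx")
# the trailing histogram lines look like "chain length = 1: N objects"
echo "$out" | grep 'chain length' || echo "(no deltas formed)"
```

On the repo above, comparing the histogram after `git repack -adf` (depth capped at 50) with the one after `git gc --aggressive` (depth up to 250) would show where the read-time cost comes from.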
git gc --aggressive led to about 40 times slower git log --raw
Hi,

I've got a repository where git log --raw _somefile took a few seconds in the past. After an attempt at merging some commits that were collected in a clone of the same repo created about a year ago, I noticed that this command was now taking 3 minutes 7 seconds. git gc, git fsck, and git clone file:///the/repo/.git also each took between ~4 and 10 minutes, and git log --raw somefile got equally unusably slow.

With the help of the people on IRC, I tracked it down to my recent use of git gc --aggressive in this repo. Running git repack -a -d -f solved it; now it's again taking 4-5 seconds. After running git gc --aggressive again for confirmation, git log --raw _somefile was again slowed down, although now 'only' to 1 minute 34 seconds; did perhaps my git remote add -f other-repo, which I remember was also running rather slowly, exacerbate the problem (to the 3 minutes I was seeing)?

The repo has about 6000 commits, about 12'000 files in the current HEAD, and about 43 MB of packed .git contents. The files are (almost) all plain text; about half of them are around 42 bytes long, the rest up to about 2 MB, although most are just around 5-50 KB. Most files mostly grow at the end. The biggest files (500 KB-2 MB) are quite long-lived and don't stop growing, again mostly at the end. Also, two directories each hold about 5'000 files, meaning that the tree objects representing those two directories are big but change only in a few places.

I've now learned to avoid git gc --aggressive. Perhaps there are some other conclusions to be drawn, I don't know.

Christian.
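The recovery described above can be reproduced end to end on a throwaway repo (synthetic contents, my construction; on the real 43 MB repo the same two commands took minutes, as reported):

```shell
#!/bin/sh
# After `git gc --aggressive` (delta chains up to 250 deep), a plain
# `git repack -a -d -f` re-deltas everything at the default --depth of 50,
# which is what restored read speed in the report above.
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q
git config user.email you@example.com
git config user.name You
for i in 1 2 3; do
  seq 1 $((i * 1000)) > somefile
  git add somefile
  git commit -qm "rev $i"
done
git gc -q --aggressive    # the step that caused the slowdown
git repack -a -d -f -q    # the fix: rewrite the pack at default depth
lines=$(git log --raw -- somefile | wc -l)
echo "git log --raw output lines: $lines"
```

On a toy repo both states are fast, of course; the point is only the order of operations, since -f forces recomputation of all deltas rather than reusing the aggressive pack's deep chains.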