[git-users] Re: Measures to take regarding large binary files in a Git repository

2014-08-25 Thread fabian.schmied


 I think you should bring this to the Git development mailing list, as the 
 technical depth is a bit beyond what we usually discuss on this list.


Thank you, I'm now trying the Git for Windows list; they might have 
some experience with this topic there.

https://groups.google.com/d/topic/msysgit/ckdNSu_-w2w/discussion

On Wednesday, August 20, 2014 11:31:21 PM UTC+2, Thomas Ferris Nicolaisen 
wrote:

 On Monday, August 18, 2014 9:31:23 AM UTC+2, fabian@gmail.com wrote:

 1. Have I got everything right in my analysis above? Am I missing 
 anything important, any problems I should expect?
 2. Would you recommend setting core.bigFileThreshold, pack.packSizeLimit 
 or other options to non-default values proactively on all clients, or 
 should I postpone this until we actually run into problems (if ever)? 
 If I don't set these values proactively, is there a chance that the Git 
 repository could be ruined?
 -- What is a good value for core.bigFileThreshold, given my actual 
 binary files of 10 to 400 MB, some of which have up to 17 revisions?
 -- What is a good value for pack.packSizeLimit? Git for Windows defaults 
 it to 2g; is there any reason not to leave it at that?
 3. Since pack.packSizeLimit does not affect the packs created for pulling 
 and pushing - what problems can I expect there? How could I tackle them?
 4. "git repack -afd" and "git gc" currently fail with an out-of-memory 
 error on the migrated repository [1][2]. Should I worry about this?
 -- I can make "git repack -afd" work by passing --window-memory 750m to 
 the command; after that, "git gc" works fine again. Again, is setting 
 pack.windowMemory to 750m something I should do proactively?



 I think you should bring this to the Git development mailing list, as the 
 technical depth is a bit beyond what we usually discuss on this list.
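
For reference, a minimal sketch of how the options from questions 2 and 4 
could be set proactively on a client. The 200m threshold is an assumed 
value chosen to sit below the 10-400 MB file sizes mentioned above, not a 
value recommended anywhere in this thread:

    # Assumed threshold: files above ~200 MB are stored deflated but not
    # delta-compressed (Git's built-in default for this is 512m)
    git config core.bigFileThreshold 200m

    # Split local packs at 2 GB, the Git for Windows default mentioned above
    git config pack.packSizeLimit 2g

    # Cap the memory each pack-objects thread may use for delta search
    git config pack.windowMemory 750m

Note that pack.packSizeLimit only applies to packs written to disk (e.g. 
by "git repack"); the single pack streamed over the wire on fetch and push 
is unaffected, which is exactly the concern raised in question 3.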




[git-users] Re: Measures to take regarding large binary files in a Git repository

2014-08-20 Thread Thomas Ferris Nicolaisen
On Monday, August 18, 2014 9:31:23 AM UTC+2, fabian@gmail.com wrote:

 1. Have I got everything right in my analysis above? Am I missing anything 
 important, any problems I should expect?
 2. Would you recommend setting core.bigFileThreshold, pack.packSizeLimit 
 or other options to non-default values proactively on all clients, or 
 should I postpone this until we actually run into problems (if ever)? 
 If I don't set these values proactively, is there a chance that the Git 
 repository could be ruined?
 -- What is a good value for core.bigFileThreshold, given my actual 
 binary files of 10 to 400 MB, some of which have up to 17 revisions?
 -- What is a good value for pack.packSizeLimit? Git for Windows defaults 
 it to 2g; is there any reason not to leave it at that?
 3. Since pack.packSizeLimit does not affect the packs created for pulling 
 and pushing - what problems can I expect there? How could I tackle them?
 4. "git repack -afd" and "git gc" currently fail with an out-of-memory 
 error on the migrated repository [1][2]. Should I worry about this?
 -- I can make "git repack -afd" work by passing --window-memory 750m to 
 the command; after that, "git gc" works fine again. Again, is setting 
 pack.windowMemory to 750m something I should do proactively?



I think you should bring this to the Git development mailing list, as the 
technical depth is a bit beyond what we usually discuss on this list.
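
For completeness, the workaround described in question 4 amounts to 
roughly the following; 750m is simply the value the poster reports as 
working, and a suitable limit depends on the machine's available RAM:

    # Repack all objects from scratch, capping per-thread delta window
    # memory so packing the 10-400 MB binaries does not exhaust memory
    git repack -a -f -d --window-memory 750m

    # With the smaller window, garbage collection reportedly succeeds again
    git gc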
