The repository is growing fast, and things are getting harder. Its size has already reached several GB, and it may eventually reach TB scale or beyond.
How do we handle that? If a transfer breaks and cannot be resumed, the time and bandwidth already spent are wasted.

Git should support resuming transfers better; at the moment it is not doing this part of its job well. Sharing code, managing code, transferring code: isn't that what we imagine a VCS should do?
zhifeng hu 

On Nov 28, 2013, at 4:50 PM, Duy Nguyen <> wrote:

> On Thu, Nov 28, 2013 at 3:35 PM, Karsten Blees <> 
> wrote:
>> Or simply download the individual files (via ftp/http) and clone locally:
>>> wget -r
>>> git clone
>>> cd linux
>>> git remote set-url origin 
>>> git://
> Yeah I didn't realize it is published over dumb http too. You may need
> to be careful with this though because it's not atomic and you may get
> refs that point nowhere because you're already done with "pack"
> directory when you come to fetching "refs" and did not see new packs...
> If dumb commit walker supports resume (I don't know) then it'll be
> safer to do
> git clone
> If it does not support resume, I don't think it's hard to do.
> -- 
> Duy
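The workaround quoted above can be sketched end to end. Since the original commands elide the actual URLs, this is a minimal local simulation: a plain recursive copy stands in for the dumb-HTTP mirror step (against a real server you would use `wget -r -c <url>`, where `-c` resumes an interrupted download), and a local bare repository path stands in for the real `git://` URL.

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Stand-in for the remote repository (in reality: the server's dumb-HTTP export).
git init -q --bare upstream.git
git clone -q upstream.git work
(cd work && echo hello > file && git add file \
  && git -c user.email=a@example.org -c user.name=a commit -qm init \
  && git push -q origin HEAD)

# Step 1: mirror the repository files. With a real dumb-HTTP server this
# would be `wget -r -c <url>`, which can be restarted as often as needed.
cp -r upstream.git mirror.git

# Step 2: clone from the local mirror -- fast, and nothing to resume.
git clone -q mirror.git linux

# Step 3: point origin back at the real server (here: the stand-in path)
# and fetch, picking up any refs/packs that changed while mirroring.
cd linux
git remote set-url origin "$tmp/upstream.git"
git fetch -q origin
```

The final fetch matters because, as Duy notes, a dumb-HTTP mirror is not atomic: refs copied late may point at packs that appeared after the pack directory was fetched, and a fetch from the real server repairs that.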
