Repositories grow fast, and things get harder. Sizes now reach several GB, and may eventually reach TB or even more. How do we handle that?
If a transfer breaks and cannot be resumed, the time and bandwidth already spent are wasted.
Git should have better support for resumable transfers; right now it does not do that job well.
Sharing code, managing code, transferring code: is this the VCS we imagine?
On Nov 28, 2013, at 4:50 PM, Duy Nguyen <pclo...@gmail.com> wrote:
> On Thu, Nov 28, 2013 at 3:35 PM, Karsten Blees <karsten.bl...@gmail.com> wrote:
>> Or simply download the individual files (via ftp/http) and clone locally:
>>> wget -r ftp://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/
>>> git clone git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git
>>> cd linux
>>> git remote set-url origin
> Yeah I didn't realize it is published over dumb http too. You may need
> to be careful with this though because it's not atomic and you may get
> refs that point nowhere because you're already done with "pack"
> directory when you come to fetching "refs" and did not see new packs...
> If dumb commit walker supports resume (I don't know) then it'll be
> safer to do
> git clone http://git.kernel.org/....
> If it does not support resume, I don't think it's hard to do.