On Wed, Jun 25, 2008 at 04:04:08PM -0400, Phillip Susi wrote:
> Zenaan Harkness wrote:
>> Sorry, what I mean is, separate files, not tarred and gzipped.
> This would mean that the server would have to decompress the pack
> file, undeltify the file, recompress it, and transmit it to the
> client, for every file in every package. That load would be several
> orders of magnitude higher than current.

Are git servers experiencing high loads relative to the volume of data
sent? I thought they were pretty efficient. It is entirely possible
that the idea is completely brain dead and I have absolutely no idea.
Apologies if I'm barking up the wrong tree...

>>> It would not be reduced by much and would have a tremendous
>>> overhead to access as a result, which the mirrors could never
>>> handle, and it would break backwards compatibility since http or
>>> ftp could no longer be used to fetch packages.

>> With an appropriate git config/setup, I'm pretty sure http and ftp
>> access is just fine (my knowledge is relatively limited on this
>> subject, though).

> For http/ftp access to still work, the repository must not be stored
> in packed+pruned form, which negates the space savings you are
> interested in, as well as only allowing access to the current
> version. In fact, it would use a _lot_ more space than it does now.

I was assuming some modifications to the underlying tools/transports
might be possible. But again, I have no idea whether what I had in
mind is feasible at all. It sounds like it isn't, and that I am
clearly misunderstanding things; I'm still not getting it. Hope I
haven't wasted too much time...

zen
-- 
Homepage: www.SoulSound.net -- Free Australia: www.UPMART.org
Please respect the confidentiality of this email as sensibly warranted.

-- 
To UNSUBSCRIBE, email to [EMAIL PROTECTED]
with a subject of "unsubscribe". Trouble? Contact [EMAIL PROTECTED]

