On Feb 4, 2008 2:27 PM, Peter Tribble <[EMAIL PROTECTED]> wrote:
> > That's a nifty name, and
> > your reasoning behind it seems quite sound. However, I think I still
> > prefer repositories to host individual files as they do now instead of
> > big archives.
>
> Well, one snag with individual files is that you end up with very
> high latencies per request.

That all depends on how you handle the requests for each file.

I agree that requesting each file individually would mean high
latency, especially on a connection across an ocean (e.g. AU to US).

However, I think that can be mitigated to an acceptable degree by
efficient communication between the server and client.
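
To put rough numbers on that, here's a minimal sketch of why batching
requests is one such mitigation. The latency and per-file figures are
made up purely for illustration:

```python
# Hypothetical latency model: fetching n files individually pays one
# round trip each, while a single batched request pays one round trip
# total. Figures below are illustrative, not measured.
def transfer_time(n_files, rtt_s, per_file_s, batched):
    """Rough estimate of total fetch time in seconds."""
    round_trips = 1 if batched else n_files
    return round_trips * rtt_s + n_files * per_file_s

# 500 files over a ~250 ms AU-to-US round trip, ~10 ms to send each file:
individual = transfer_time(500, 0.25, 0.01, batched=False)  # ~130 s
batched = transfer_time(500, 0.25, 0.01, batched=True)      # ~5.25 s
```

The per-request round trip dominates; the same payload moves either
way, so batching (or pipelining) is where the win comes from.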

> If people will indulge me for a while:
>
> One of the reasons for choosing zip is that it's universally available,
> and has bindings to many languages. Yes, there are other
> compression schemes, but are they likely to be installed out of the box
> on most machines you'll ever use?

This is the best reason I've seen so far to choose zip over lzma, etc.
The ubiquity of the zip format and its tools all but ensures packagers
and users will be able to bend it to their own nefarious purposes with
ease.
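
As a small demonstration of that ubiquity, Python's standard library
alone can build and pick apart such an archive; the "zap"-style layout
below is a made-up example:

```python
import io
import zipfile

# Build a tiny in-memory archive standing in for a hypothetical zap
# file; any stock zip tool could do the same with a file on disk.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("pkg/manifest", "name: example\nversion: 1.0\n")
    zf.writestr("pkg/bin/hello", "#!/bin/sh\necho hello\n")

# Listing and extracting needs nothing beyond the standard library.
with zipfile.ZipFile(buf) as zf:
    names = zf.namelist()               # ['pkg/manifest', 'pkg/bin/hello']
    manifest = zf.read("pkg/manifest").decode()
```

The same could be done from Java, Perl, unzip(1), or a file manager,
which is exactly the "open it on pretty much any box" property.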

However, if an alternative compression algorithm delivers better
performance and better compression (noticeably so), I would be
inclined to favour it, or at least make it the default.

> Another reason for using something that exists already is that I
> absolutely don't want to be forced to use specific tools to manipulate
> a package. I see it as a great advantage that I could open up a
> zap file on pretty much any box in existence and look inside.
> Another part of this is that simply unzipping the archive gives
> me the files that are runnable without installation (it may be possible
> to avoid having to supply software in both packaged and unpackaged
> formats [eg. java where you can get either SVR4 packages or self
> extracting executables] by structuring the files inside the archive
> appropriately).
>
> (The wish to have fully functional files upon simple unpacking
> argues against a tar file containing compressed files; it also
> means you can't use the mechanism on the installation
> media, where the files are wrapped up again inside a cpio
> archive.)

I must admit, being able to provide a single file that a user can
either unzip somewhere and (possibly) run directly, or install as a
package, sounds quite beneficial to me.

As a developer, I certainly would appreciate the lessening of software
distribution burdens.

> If you are after optimized operations (such as only getting that
> subset of the files in a package that you want) then I suspect
> you would actually want to talk to an IPS server and engage in
> a conversation with it. But I would have the server having fully
> populated zap files on its disk and spitting an optimized
> zap file back to the client :-)

I'm not so sure about this personally.

I think I'd rather serve my repositories as individual files from a
compressed ZFS filesystem, and have the option of serving
zap-compressed files back to the client on the fly.

Having the files that comprise a package separated out individually
for repository purposes opens up some interesting possibilities for
load-balancing content serving.

While I know it's possible to do that with byte ranges of files, I
can't help but think it is far easier to do it using individual files.

-- 
Shawn Walker, Software and Systems Analyst
http://binarycrusader.blogspot.com/

"To err is human -- and to blame it on a computer is even more so." -
Robert Orben
_______________________________________________
pkg-discuss mailing list
[email protected]
http://mail.opensolaris.org/mailman/listinfo/pkg-discuss
