On Monday, August 25, 2003, at 11:57 am, Erik Hofman wrote:

> As a Unix user, the first thing that comes to my mind is of course tar and gzip (or maybe bzip2). I am aware of the limitations of the tar format, but the scan-once-for-a-TOC method seemed fast enough for me.

For very large archives, I contend this is not the case, and FG's startup performance is already, uh, poor. Pulling a 100 MB or 200 MB archive off the disk and through memory is going to hit any machine hard.
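
A minimal sketch of what "scan once for a TOC" actually costs on a .tar.gz, assuming zlib's gzFile API (illustrative only, not the FlightGear loader): gzseek() on a file opened for reading can only emulate seeking by inflating and discarding, so merely listing the entries still decompresses the whole archive.

    #include <zlib.h>
    #include <cstdio>
    #include <cstdlib>

    int main(int argc, char** argv) {
        if (argc < 2) { std::fprintf(stderr, "usage: %s archive.tar.gz\n", argv[0]); return 1; }
        gzFile gz = gzopen(argv[1], "rb");
        if (!gz) { std::perror("gzopen"); return 1; }

        char header[512];                           // one 512-byte header per tar entry
        while (gzread(gz, header, 512) == 512) {
            if (header[0] == '\0') break;           // all-zero block: end of archive
            std::printf("%.100s\n", header);        // name field: first 100 bytes
            long size = std::strtol(header + 124, nullptr, 8); // size field, octal ASCII
            long skip = ((size + 511) / 512) * 512; // data is padded to 512-byte blocks
            gzseek(gz, skip, SEEK_CUR);             // "seek" here = decompress and discard
        }
        gzclose(gz);
        return 0;
    }

On a 200 MB archive that gzseek() loop alone pulls every byte through zlib, which is exactly the startup hit described above.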


> Regarding ZIP files, is it legal to use the compression algorithm without any limitations at the moment? (GIF, for example, has a similar issue.)

ZIP can use a range of compression schemes (including BZIP2!), and only one of them (Shrink, the LZW-based scheme) is covered by the LZW patent, which in any case expired in the US at the end of June (and expires in Europe and Japan early next year). I suspect encoders deliberately avoid this scheme, or simply don't support it, to sidestep the issue. Notably, the CrystalSpace code appears to support only the common deflate/implode methods, but since I have personally used various 'random' ZIPs with CrystalSpace, I assume those are the widely used methods.
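
ZIP, by contrast, keeps its directory at the end of the file, so building a TOC is a couple of seeks plus one small read per entry. A hedged standalone sketch (not CrystalSpace's reader; offsets per PKWARE's APPNOTE, and it assumes no trailing archive comment so the end record sits at a fixed position):

    #include <cstdio>
    #include <cstdint>
    #include <vector>

    static uint16_t rd16(const uint8_t* p) { return (uint16_t)(p[0] | p[1] << 8); }
    static uint32_t rd32(const uint8_t* p) {
        return p[0] | p[1] << 8 | p[2] << 16 | (uint32_t)p[3] << 24;
    }

    int main(int argc, char** argv) {
        if (argc < 2) { std::fprintf(stderr, "usage: %s file.zip\n", argv[0]); return 1; }
        std::FILE* f = std::fopen(argv[1], "rb");
        if (!f) { std::perror("fopen"); return 1; }

        uint8_t eocd[22];                        // End of Central Directory record
        std::fseek(f, -22, SEEK_END);            // fixed offset iff no archive comment
        if (std::fread(eocd, 1, 22, f) != 22 || rd32(eocd) != 0x06054b50) {
            std::fprintf(stderr, "EOCD not found (trailing comment?)\n"); return 1;
        }
        uint16_t entries = rd16(eocd + 10);      // total entry count
        std::fseek(f, rd32(eocd + 16), SEEK_SET); // offset of the central directory

        for (uint16_t i = 0; i < entries; ++i) {
            uint8_t h[46];                       // fixed part of one directory record
            if (std::fread(h, 1, 46, f) != 46 || rd32(h) != 0x02014b50) break;
            uint16_t method = rd16(h + 10);      // 0=store 1=shrink(LZW) 6=implode 8=deflate 12=bzip2
            uint16_t nameLen = rd16(h + 28), extraLen = rd16(h + 30), cmtLen = rd16(h + 32);
            std::vector<char> name(nameLen + 1, '\0');
            std::fread(name.data(), 1, nameLen, f);
            std::printf("method %2u  %s\n", method, name.data());
            std::fseek(f, extraLen + cmtLen, SEEK_CUR); // skip the variable-length tail
        }
        std::fclose(f);
        return 0;
    }

Running that over a few 'random' ZIPs is also an easy way to test the claim above: if every entry reports method 8, they are plain deflate.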


Presumably, this also means the code could be extended to support the BZip2 method by linking with libbzip2 (which is widely deployed, on Linux at least), and we'd get much smaller ZIP archives. I've no idea what spectrum of ZIP readers (e.g. Nautilus, Konqueror, WinZip, or WinXP's built-in ZIP-as-a-folder mode) supports that encoding.
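
For the decode half of such an extension, a rough sketch: a member stored with ZIP's bzip2 method (ID 12 in the APPNOTE) carries an ordinary bzip2 stream, so libbzip2's one-shot helper can expand it once the sizes are known from the central directory. The function name and buffers below are illustrative, not from any existing reader:

    #include <bzlib.h>
    #include <cstdio>
    #include <vector>

    // comp holds one member's compressed bytes; uncompSize comes from its
    // central-directory record.
    std::vector<char> inflateBzip2Member(std::vector<char>& comp, unsigned uncompSize) {
        std::vector<char> out(uncompSize);
        unsigned destLen = uncompSize;
        int rc = BZ2_bzBuffToBuffDecompress(out.data(), &destLen,
                                            comp.data(), (unsigned)comp.size(),
                                            0 /* small */, 0 /* verbosity */);
        if (rc != BZ_OK) {                      // BZ_OK on success, per bzlib.h
            std::fprintf(stderr, "bzip2 decode failed (%d)\n", rc);
            out.clear();
        }
        return out;
    }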

H&H
James

--
We are all in the gutter, but some of us are looking at the stars.


