> Not at all.  Most things that get compressed are code files and they don't
> come with any particular information that you need in order to run them
> (they have a start address, but it isn't strictly necessary).  Some existing
> compressors such as powercrunch (which isn't that powerful at all) make you
> specify the starting address for obvious reasons.  In any case, the gzip
> header might well be able to contain the filetype information.
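There is indeed room in the gzip header for that kind of thing: RFC 1952 defines an optional FEXTRA field of length-prefixed subfields, which is where filetype information could plausibly go. A minimal sketch of parsing the fixed header and the extra field (illustrative only; the subfield layout for a filetype tag would be an invented convention, not part of the standard):

```python
import gzip
import struct

def parse_gzip_header(blob: bytes) -> dict:
    # RFC 1952 fixed header: magic (1f 8b), method, flags, mtime, XFL, OS byte.
    magic, method, flags, mtime, xfl, os_byte = struct.unpack("<HBBIBB", blob[:10])
    assert magic == 0x8B1F and method == 8  # 8 = deflate
    info = {"mtime": mtime, "os": os_byte}
    pos = 10
    if flags & 0x04:  # FEXTRA bit: room for e.g. a filetype subfield
        (xlen,) = struct.unpack("<H", blob[pos:pos + 2])
        info["extra"] = blob[pos + 2:pos + 2 + xlen]
    return info

# Any ordinary gzip stream has the fixed header:
info = parse_gzip_header(gzip.compress(b"hello", mtime=0))
```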

I was working on a gzip-like compressor/decompressor (using LZW compression),
but for lack of time I never finished them. I believe that at least
one of them produces the correct result now. If I get time, I'll look
into them again....I remember that the big trick was reading and writing
a byte to a file on the disc - I found no simple way of doing that in
machine code, so currently I'm using BASIC calls ;)
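For reference, the core of LZW is small; a toy compressor in Python looks like this (a sketch of the general algorithm, not the poster's actual machine-code implementation):

```python
def lzw_compress(data: bytes) -> list[int]:
    """Classic LZW: grow a string table, emit codes for longest matches."""
    # Start with a table of all 256 single-byte strings.
    table = {bytes([i]): i for i in range(256)}
    next_code = 256
    codes = []
    w = b""
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc  # keep extending the current match
        else:
            codes.append(table[w])      # emit code for the longest match
            table[wc] = next_code       # add the new string to the table
            next_code += 1
            w = bytes([byte])
    if w:
        codes.append(table[w])
    return codes

# Repeated input quickly starts hitting multi-byte table entries:
lzw_compress(b"ABABABA")  # -> [65, 66, 256, 258]
```

The hard part on a Z80 machine isn't this loop but the table storage and the byte-at-a-time disc I/O the poster mentions.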

> >       LHARC could be used to do that much simpler and in less space and in 
> > one
> > process!
> 
> I don't know what LHARC is.  Why is there such a large variety of file
> formats around (lbr, arc/ark, lha, hqx, zip, zoo, lzh, several forms
> of lzw, tar, ...)? It seems that every time someone wants to archive
> something they invent a new format for it instead of using one of the
> commonly available ones.  Anyway, the compression rate of ".tar.gz" is
> hard to beat, so if you want to make everyone use lharc you'd better be
> sure of what you are doing...

True.....and mine is yet another one ;)

> 
> > Obviously the multi-task facility would be functional but NOT for general 
> > use. 
> > It'd normally only get used if say a remote user was logged in and the 
> > operator (me) needed to do somthing important!
> 
> A remote user logged in to a Sam?  Get real!
> 
> >                                                otherwise it'd be only 
> > running
> > a single task but using the unix parent/child processing to seriously 
> > simplify 
> > the linking of programs actions to make a powerful if not blindingly fast 
> > system:-)
> 
> You can't make a useable Unix process hierarchy on a Sam.  It hasn't got
> basic memory allocation facilities.  Sure, if all your programs are below
> 32K then you can page them in and out without too much trouble.  But how
> are you going to run programs which are over 32K?  There's no way you can
> prevent one process from reading or scribbling over another process's memory
> and/or screwing up the entire system.

Isn't this the way DRiVER works?? I was thinking about a way of
'multiprogramming' on the SAM by using 16/32K program blocks. If you needed
bigger programs, you would have to call a 'system routine' that paged the
needed 16/32K chunks of memory in and out, and made the data space part
of the 16/32K chunks.
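The chunk scheme described above amounts to bank switching: a program only ever sees one mapped window, and a system routine decides which chunk sits behind it. A toy model (hypothetical, not how DRiVER actually does it):

```python
PAGE_SIZE = 16 * 1024  # 16K chunks, as in the scheme above

class BankedMemory:
    """Toy model: many 16K chunks, one visible window at a time."""

    def __init__(self, n_pages: int):
        self.pages = [bytearray(PAGE_SIZE) for _ in range(n_pages)]
        self.mapped = 0  # which chunk the window currently exposes

    def page_in(self, page: int) -> None:
        # The 'system routine': switch which chunk is visible.
        self.mapped = page

    def read(self, addr: int) -> int:
        # All accesses go through the mapped chunk, so a program
        # larger than the window must call page_in() first.
        return self.pages[self.mapped][addr % PAGE_SIZE]

    def write(self, addr: int, value: int) -> None:
        self.pages[self.mapped][addr % PAGE_SIZE] = value
```

Note this gives overlaying, not protection: nothing stops a program from paging in someone else's chunk, which is exactly the objection raised above.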

> 
> Sorry, but if I want a useable Unix system I'll buy a 486 or a 68040...

I think I'd buy a Pentium or a DEC Alpha with OSF/1...........
-- 
* Frode Tennebo                         * It's better to live life in     *
* email: [EMAIL PROTECTED]              * wealth and die poor, than live  * 
* phone: +47 712 57716                  * life in poverty and die rich.   *
* snail: Parkv. 31, 6400 Molde, NORWAY  *                   -Frode Tennebo*
